Dongwook Yoon is an Assistant Professor in the Department of Computer Science at the University of British Columbia, and a member of IMAGER and Design for People. His research lies at the intersection of human-computer interaction (HCI), computer-supported cooperative work (CSCW), computer-mediated communication (CMC), augmented and virtual reality, and educational technology. He focuses on building rich collaboration systems that offer expressive multimodal interactions, i.e., interactions through multiple communication channels (e.g., speech, gesture, and grasp). His design approach translates natural human interactions into novel combinations of input modalities that serve as building blocks for fluid, rich, and lightweight interfaces. He deploys and evaluates high-fidelity systems in real-world contexts (e.g., classrooms), from which he obtains ecologically valid user data.
Working with me
I have openings for two graduate students to join my lab at the MSc or PhD level for the 2018/19 academic year. If you are interested in multimodal interaction, educational technology, or virtual/augmented reality, please apply!
UBC students at any level (B.S., M.S., Ph.D.) are welcome to contact me about potential advising and collaboration. Two graduate research assistantship or undergraduate research internship positions are currently available in my lab. Here is a detailed description of the ongoing projects:
As a potential research collaborator, I expect you to understand the basic languages of HCI, software development, and statistics. No worries: it's OK not to be proficient in all of these domains. But if you are inexperienced, please consider taking introductory courses at UBC (e.g., CPSC 344, 544) or MOOCs. They will help you figure out what HCI is about and whether you really want to do HCI research.
If you do contact me, please make sure to address the following three points.
- First, who you are. I want to know which year and program you are in and what kind of skill set you have. If you want me as a committee member, please also name your supervisor. Attaching your CV, resume, or portfolio is the easiest way to do this.
- Second, what you need. Are you looking for an advisor, a committee member, a funded RAship, or just research experience? Please be up front about what you want to get out of the meeting. If need be, please share your academic timeline.
- Third, your research interests. What do you want to study, and why do you think I might be a good person to work with? To answer these questions, you would need to watch the videos of my projects and then read the papers you find particularly interesting. It'd be great if you can propose a solid research idea, but if you don't have a clear one, that's fine. State your area of interest instead, and I might suggest a research question or invite you to work on an ongoing project.
If you have read this far, you are perhaps genuinely interested in working with me. I invite you to email me to ask for a meeting. You can look up my free/busy schedule to find a time slot that works for both of us. If we luck out, we will work together to design, build, and test cool new interactive systems. If you have self-motivation, commitment, and passion for quality research, I will find resources to support your work and give you feedback to advance your practice.
University of British Columbia
CPSC 554Y: Topics in Human-Computer Interaction – Multimodal Interaction
Term 2 2017-18 (Jan - Apr)
CSCW 2016, UIST 2014
A multimodal annotation system supports rich collaboration through voice + gesture expressions.
A transcription-based voice editing system reduces the workload of speech commenting.
Grasp + Micro-mobility
Capacitive grip sensing and inertial motion sensing capture the context of collaborative interactions.
A pen + touch gesture expands writing space in a fluid document layout.
CHI WIP 2013
A multi-touch gesture supports nonlinear page navigation through a set of lightweight bookmarking interactions.
Using the smartphone as a 3D controller supports direct manipulation of an on-screen virtual object.
US Patent #8878875
An augmented reality e-book interaction technique supports tangible interactions for page navigation.
Facilitating Complex Referencing of Visual Materials in an Asynchronous Discussion Interface
Soon Hau Chua, Toni-Jan Keith Palma Monserrat, Dongwook Yoon, Juho Kim, and Shengdong Zhao
CSCW 2018 Full Paper · Conditionally Accepted
TypeTalker: A Speech Synthesis-Based Multimodal Commenting System
Ian Arawjo, Dongwook Yoon, and François Guimbretière
CSCW 2017 Full Paper · Video
SimpleSpeech: Simplified Audio Production in Asynchronous Voice-Based Discussions
Venkatesh Sivaraman, Dongwook Yoon, and Piotr Mitros
CHI 2016 Full Paper · Video
RichReview++: Deployment of a Collaborative Multimodal Annotation System for Instructor Feedback and Peer Discussion
Dongwook Yoon, Nicholas Chen, Bernie Randles, Amy Cheatle, Steven Jackson, Corinna Loeckenhoff, Abigail Sellen, and François Guimbretière
CSCW 2016 Full Paper · Video
Sensing Tablet Grasp + Micro-mobility for Active Reading
Dongwook Yoon, Ken Hinckley, Hrvoje Benko, François Guimbretière, Pourang Irani, Michel Pahud, and Marcel Gavriliu
UIST 2015 Full Paper · Video
RichReview: blending ink, speech, and gesture to support collaborative document review
Dongwook Yoon, Nicholas Chen, François Guimbretière, and Abigail Sellen
UIST 2014 Full Paper · Video
TextTearing: opening white space for digital ink annotation
Dongwook Yoon, Nicholas Chen, and François Guimbretière
UIST 2013 Short Paper · Video
Mobiature: 3d model manipulation technique for large displays using mobile devices
Dongwook Yoon, Joong Ho Lee, Kiwon Yeom, and Ji-Hyung Park
ICCE 2011 Short Paper · Video