
Dongwook Yoon is an Assistant Professor at the Department of Computer Science, University of British Columbia, and a member of IMAGER and Design for People. His research lies at the intersection of human-computer interaction (HCI), computer-supported cooperative work (CSCW), computer-mediated communication (CMC), and educational technology. He focuses on building rich collaboration systems that offer expressive multimodal interactions, i.e., interactions through multiple communication channels (e.g., speech, gesture, and grasp). His design approach translates natural human interactions into novel combinations of input modalities that serve as building blocks for fluid, rich, and lightweight interfaces. He deploys and evaluates high-fidelity systems in real-world contexts (e.g., classrooms), from which he obtains ecologically valid user data.

For further information, please see my curriculum vitæ, research statement, and teaching statement.

Working with me

Interested in doing HCI research with me? UBC students at any level (B.S., M.S., Ph.D.) are welcome to contact me about potential advising and collaboration. I can also serve as a committee member for students in other departments at UBC or at other schools. If you are looking for information about the graduate admission process at CS, please contact our graduate program.

As a potential research collaborator, you should understand the basics of HCI, software development, and statistics. Don't worry: it's OK not to be proficient in all of these areas, but if you are inexperienced, please consider taking the introductory courses at UBC (e.g., CPSC 344, 544) or MOOCs. They will help you figure out what HCI is about and whether you really want to do HCI research.

If you do contact me, please address the following three points.

  • First, who you are. I want to know which year and program you are in and what kind of skill set you have. If you want me as a committee member, please indicate your supervisor. Attaching your CV, resume, or portfolio is the easiest way to go.
  • Second, what you need. Are you looking for an advisor, a committee member, a funded RAship, or just research experience? Please be up front about what you want to get out of the meeting. If need be, please share your academic timeline.
  • Third, your research interests. What do you want to study, and why do you think I might be a good person to work with? To answer these questions, you should watch the videos of my projects and then read the papers you find particularly interesting. If you have solid research ideas, please describe them with brevity. If you don't have a clear idea yet, that's fine: state your area of interest instead, and I might suggest a research question or invite you to work on an ongoing project.

If you have read this far, you are probably genuinely interested in working with me. I invite you to email me to ask for a meeting. You can look up my free/busy schedule to find a time slot that works for both of us. If we luck out, we will work together to design, build, and test cool new interactive systems. If you bring self-motivation, commitment, and passion for quality research, I will find resources to support your work and give you feedback to advance your practice.


University of British Columbia

CPSC 554Y: Topics in Human-Computer Interaction – Multimodal Interaction
Term 2 2018 (Jan)

Cornell University

CS 1110: Introduction to Computing Using Python (w/ Lillian Lee, Steve Marschner, and Walker White)
Spring 2013 · Spring 2014 · Fall 2014


RichReview CSCW 2016, UIST 2014

A multimodal annotation system supports rich collaboration through voice + gesture expressions.

SimpleSpeech CHI 2016

A transcription-based voice editing system reduces workloads in speech commenting.

Grasp + Micro-mobility UIST 2015

Capacitive grip sensing and inertial motion sensing capture the contexts of collaborative interactions.

TextTearing UIST 2013

A pen + touch gesture expands writing space in a fluid document layout.

TouchBookmark CHI WIP 2013

A multi-touch gesture supports nonlinear page navigation through a set of lightweight bookmarking interactions.

Mobiature ICCE 2012

Using the smartphone as a 3D controller supports direct manipulation of an on-screen virtual object.

PhantomBook US Patent #8878875

An augmented reality e-book interaction technique supports tangible interactions for page navigation.

Refereed Publications

Facilitating Complex Referencing of Visual Materials in Asynchronous Discussion Interface
Soon Hau Chua, Toni-Jan Keith Palma Monserrat, Dongwook Yoon, Juho Kim, and Shengdong Zhao
CSCW 2018 Full Paper · Conditionally Accepted

TypeTalker: A Speech Synthesis-Based Multimodal Commenting System
Ian Arawjo, Dongwook Yoon, and François Guimbretière
CSCW 2017 Full Paper · Video

SimpleSpeech: Simplified Audio Production in Asynchronous Voice-Based Discussions
Venkatesh Sivaraman, Dongwook Yoon, and Piotr Mitros
CHI 2016 Full Paper · Video

RichReview++: Deployment of a Collaborative Multimodal Annotation System for Instructor Feedback and Peer Discussion
Dongwook Yoon, Nicholas Chen, Bernie Randles, Amy Cheatle, Steven Jackson, Corinna Loeckenhoff, Abigail Sellen, and François Guimbretière
CSCW 2016 Full Paper · Video

Sensing Tablet Grasp + Micro-mobility for Active Reading
Dongwook Yoon, Ken Hinckley, Hrvoje Benko, François Guimbretière, Pourang Irani, Michel Pahud, and Marcel Gavriliu
UIST 2015 Full Paper · Video

RichReview: blending ink, speech, and gesture to support collaborative document review
Dongwook Yoon, Nicholas Chen, François Guimbretière, and Abigail Sellen
UIST 2014 Full Paper · Video

TextTearing: opening white space for digital ink annotation
Dongwook Yoon, Nicholas Chen, and François Guimbretière
UIST 2013 Short Paper · Video

Mobiature: 3D Model Manipulation Technique for Large Displays Using Mobile Devices
Dongwook Yoon, Joong Ho Lee, Kiwon Yeom, and Ji-Hyung Park
ICCE 2011 Short Paper · Video