Hi there! I am a PhD student studying Human-Computer Interaction (HCI) in the DGP lab at the University of Toronto, supervised by Prof. Ravin Balakrishnan and Prof. Tovi Grossman. My research interests are mainly in AR/VR interactions, freehand interactions, 3D user interfaces, and technology-assisted learning.

Please contact me for my CV.


"I Choose Assistive Devices That Save My Face": A Study on Perceptions of Accessibility and Assistive Technology Use Conducted in China

Franklin Mingzhe Li, Di Laura Chen, Mingming Fan, and Khai N. Truong. Proceedings of the ACM CHI Conference on Human Factors in Computing Systems (CHI), May 2021.

Disambiguation Techniques for Freehand Object Manipulations in Virtual Reality

Di Laura Chen, Ravin Balakrishnan, and Tovi Grossman. IEEE Conference on Virtual Reality and 3D User Interfaces (IEEE VR), March 2020.
[PDF] [Video] [Presentation]

FMT: A Wearable Camera-Based Object Tracking Memory Aid for Older Adults

Franklin Mingzhe Li, Di Laura Chen, Mingming Fan, and Khai N. Truong. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies (IMWUT), September 2019.

Integrating Multimedia Tools to Enrich Interactions in Live Streaming for Language Learning

Di Laura Chen, Dustin Freeman, and Ravin Balakrishnan. Proceedings of the ACM CHI Conference on Human Factors in Computing Systems (CHI), May 2019.


Freehand Interactions in AR

I implemented a set of freehand interaction techniques by combining a hand-tracking sensor with an AR head-mounted display. Users can directly interact with virtual objects with their bare hands in physically realistic ways.

AR for Dance and Sports Training

An AR application developed on Microsoft HoloLens that assists users in learning movements, such as in dance or sports. We used motion-captured data of real instructors performing movements to animate a holographic virtual trainer. We explored a series of visualization techniques to facilitate learning, including footprints, body motion curves, and keyframes.
[Slides (videos included)]

Visualizing Food Ingredients Based On Consumer Interests

Ingredient labels are crucial in helping consumers make informed decisions about the food they buy and eat. However, ingredient labels are often difficult to understand. IngVis is an interactive application I developed that visualizes food ingredients, dynamically displaying ingredient information based on varying consumer needs.

3D Modelling and Fabrication of Multi-Purpose Tool

Designed and fabricated a multi-purpose, foldable cutlery set. Performed 3D modelling of the tool using OpenSCAD, built a Python application interface to make features of the tool customizable, 3D-printed the components and assembled the tool.
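The Python customization interface above could follow a common pattern: generate parametric OpenSCAD source from user-chosen dimensions, then hand the file to OpenSCAD for rendering and STL export. A minimal sketch, assuming hypothetical parameters (the actual tool's features and dimensions are not specified here):

```python
# Hypothetical sketch: a Python interface that emits parametric OpenSCAD
# source for one customizable component (a handle plate with a hanging hole).
# All parameter names and defaults are illustrative assumptions.
def handle_scad(length=80, width=20, thickness=3, hole_d=4):
    """Return OpenSCAD code for a plate with a through-hole near one end."""
    return f"""
// auto-generated: customizable handle plate
difference() {{
    cube([{length}, {width}, {thickness}], center = true);
    translate([{length / 2 - hole_d}, 0, 0])
        cylinder(h = {thickness + 1}, d = {hole_d}, center = true, $fn = 32);
}}
""".strip()

code = handle_scad(length=100, hole_d=5)
# Write `code` to a .scad file, then render/export with the OpenSCAD CLI.
```

Generating source text rather than calling a geometry API keeps the interface thin: the customizable features live in ordinary Python function parameters, while OpenSCAD does all the solid modelling.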

UofT Timetable Generator

The UofT Timetable Generator is an Android application that helps University of Toronto students compose their course schedules based on the university's publicly posted calendar. We devised and tested an efficient algorithm to select the best timetable for a set of chosen courses and other user settings, such as a preference for morning or evening sections.
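The selection step can be sketched as a search over section combinations: enumerate one section per course, discard combinations with time conflicts, and rank the rest by how well they match the user's preferences. A minimal illustration with made-up course data (the app's actual algorithm and scoring are not described here):

```python
# Hypothetical sketch of timetable selection. Each course maps to candidate
# sections; a section is a set of (day, hour) slots plus its start hour.
from itertools import product

courses = {
    "CSC108": [{"slots": {("Mon", 9), ("Wed", 9)}, "start": 9},
               {"slots": {("Mon", 18), ("Wed", 18)}, "start": 18}],
    "MAT137": [{"slots": {("Mon", 9)}, "start": 9},
               {"slots": {("Tue", 10)}, "start": 10}],
}

def conflict_free(sections):
    """True if no two sections share a (day, hour) slot."""
    seen = set()
    for s in sections:
        if seen & s["slots"]:
            return False
        seen |= s["slots"]
    return True

def score(sections, prefer_morning=True):
    """Count sections matching the morning/evening preference."""
    return sum((s["start"] < 12) == prefer_morning for s in sections)

def best_timetable(courses, prefer_morning=True):
    # Enumerate one section per course, keep conflict-free combinations,
    # and pick the highest-scoring one.
    candidates = [c for c in product(*courses.values()) if conflict_free(c)]
    return max(candidates, key=lambda c: score(c, prefer_morning), default=None)

best = best_timetable(courses)
```

Brute-force enumeration like this is exponential in the number of courses; a real scheduler would prune conflicting partial combinations early, but the ranking idea is the same.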
[Website] [Video]


Sept 2016 - Present

Teaching Assistant

Teaching assistant for CSC108 and CSC318

May 2022 - Dec 2022

Research Intern

Worked on evaluating gesture learning in VR through analysis of physical movements

May 2021 - Sept 2021

Course Instructor

Course instructor for CSC108: Introduction to Computer Programming

May 2020 - Sept 2020

Research Intern

Chatham Labs
(acquired by Meta)

Worked on object depth estimation in VR using eye gaze and controller raycasting

May 2014 - Aug 2015

Mobile Application Developer Intern

(acquired by Naver)

Worked on Android and iOS mobile app development


  • chendi@dgp.toronto.edu