I am an Assistant Professor at the School of Computer Science, University of Waterloo. I direct the WatVis (Waterloo Visualization) research group. I am affiliated with the WaterlooHCI lab. I am also a member of the Waterloo Artificial Intelligence Institute (Waterloo.ai) and the Waterloo Institute for Sustainable Aeronautics (WISA). I received my Ph.D. from the Department of Computer Science, University of Toronto, where I was affiliated with the Dynamic Graphics Project (DGP) lab.

My research focuses on the areas of Information Visualization (InfoVis), Human-Computer Interaction (HCI), Visual Analytics, and Data Science. I develop novel interactive systems and visual representations that promote the interplay between humans, machines, and data. My research aims to boost the efficiency of human-data interaction with exploratory and explanatory interfaces that tightly integrate the flexibility and creativity of users with the scalability of algorithms and machine learning.

I am always seeking highly motivated and hard-working postdoctoral researchers and Ph.D., M.Math., and undergraduate students to work with my team on exciting research projects. Please read the Prospective page before contacting me; otherwise, you are unlikely to get a response.

Recent and Best

Piet (CHI'24)
Fluid color manipulation for motion graphics videos.
Mondrian (CHI'24)
Exploring the abstraction-driven and concrete-driven approaches for image color authoring.
EmoWear (CHI'24)
Expressing emotions in voice messages on smartwatches using animated chat balloons.
Visual guidance in VR (CHI'24)
Investigating various visual encodings for guiding precise bare hand gestures in VR/AR.
CoPrompt (CHI'24)
Collaboratively prompting with generative AI for programming tasks.
Gestures for earables (IMWUT'24)
Exploring regions for mid-air and on-skin gestures for earable devices.
Streaming in VR (ISS'23)
Investigating practices, challenges, and opportunities of VR streaming on Twitch.
Eggly (IMWUT'23)
Empowering neurofeedback training for children with ASD through mobile augmented reality games.
De-Stijl (CHI'23)
Black, yellow, or blue? Crafting the colors of your graphic designs with De-Stijl in a breeze!
Slide4N (CHI'23)
Making slides right from Jupyter notebooks with tight human-AI collaboration.
VibEmoji (CHI'22)
Authoring multimodal emoticons consisting of stickers, animation effects, and vibrotactile patterns.
ChartSeer (TVCG'20)
Exploratory visual analysis with chart recommendations.
Hand-over-face gestures (MobileHCI'19)
Interacting with your smartphone through the built-in camera.
KTGraph (VAST'17)
Studying handoff of partial findings in asynchronous collaborative sensemaking.
EgoLines (CHI'16)
Visual exploration of dynamic egocentric networks.
MatrixWave (CHI'15)
Visual comparison of event sequence data, such as user clickstreams on websites.
FluxFlow (VAST'14)
Visual analysis of anomalous information spreading in social media.

Sponsors

Thanks to the following sponsors for kindly offering cash and in-kind contributions to our research!