Profile Photo

Hi, I'm Matthew.

  • User Experience Researcher
  • Computer Science (MSc)
  • Cognitive Systems (BA)

I have a passion for understanding how people feel, think, and act in order to inform the design of meaningful technology and services. I am experienced in user research methods such as qualitative/quantitative analysis, research design, and prototyping.


I enjoy photography because it keeps me observant of human moments. My taste in film spans from David Lynch to Korean thrillers. I play video games to experience ideas in an interactive manner.

I think life is only interesting if you engage with people and the mediums that surround them.

Featured Projects

  • Mirrored Reality Yoga (Research, Perception)
  • Haptic Design Practices and Tools (Research, Development, Haptics)
  • Marking Life’s Moments (Research, UX)
  • VR Time Perception (Research, Development, Perception)
  • Crowdsourcing Haptic Effects (Research, Development, Haptics)
  • Collaboration in Cloud Services (Research, UX)

Mirrored Reality Yoga

Exploring How We Can See and Act Physically Using Virtual Reality
Term: UBC 2014 Winter Term 2
Keywords:
  • Human Computer Interaction
  • Virtual Reality
  • Augmented Reality
  • Perception
  • Embodied Learning
  • Unity

What if people could see themselves as others do? How would their behaviour change as a result?


This question served as the inspiration for my project in an undergraduate Cognitive Systems research course (COGS 402), which sought to investigate questions of human perception and knowledge processing.


To address this question, my project partner and I used a Unity tool that brings conventional web camera feeds into a Unity game environment. We used an Oculus Rift DK1 as a “mirror” through which individuals could see themselves from the camera’s point of view. We collaborated with a yoga instructor and learned that conveying posture corrections in terms of posture angles could serve as a pragmatic goal for a “mirrored” reality system. The project culminated in a public research showcase where attendees could safely try some simple yoga poses while wearing the VR headset. While participants did face challenges in performing the yoga tasks, many were enthusiastic and amused by the possibility of seeing themselves as others do.
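To make the "posture angle" idea concrete, here is a minimal sketch (in JavaScript, for illustration only; the actual system was built in Unity) of expressing a correction as the difference between a joint's current angle and a target pose angle. The joint names and target value are hypothetical.

```javascript
// Sketch: compute the angle at a joint (e.g., the knee) from three tracked
// points and compare it to a target pose angle. Point names and the target
// value are hypothetical; the real system was implemented in Unity.
function jointAngleDegrees(a, joint, b) {
  // Vectors from the joint to the two neighbouring points.
  const v1 = { x: a.x - joint.x, y: a.y - joint.y };
  const v2 = { x: b.x - joint.x, y: b.y - joint.y };
  const dot = v1.x * v2.x + v1.y * v2.y;
  const mag = Math.hypot(v1.x, v1.y) * Math.hypot(v2.x, v2.y);
  return (Math.acos(dot / mag) * 180) / Math.PI;
}

// Example: how far is the current knee angle from a (hypothetical) target angle?
const hip = { x: 0.0, y: 1.0 };
const knee = { x: 0.1, y: 0.5 };
const ankle = { x: 0.1, y: 0.0 };
const current = jointAngleDegrees(hip, knee, ankle);
const target = 160; // illustrative target angle for the pose
console.log(`Knee angle: ${current.toFixed(1)}, off by ${(target - current).toFixed(1)} degrees`);
```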

Credit: Tested. An example of viewing oneself in the third person using a VR system with a camera.

A participant performing a Yoga pose using a VR headset. The participant can see themselves in a third person perspective. Faces blurred to protect identities.

VR Time Perception

Reducing the Perception of Latency
Time Period: 2014 to 2015
Keywords:
  • Human Computer Interaction
  • Virtual Reality
  • Research Methodology
  • Unity

From 2014 to 2015, I had the pleasure of working at the Japanese research company Nippon Telegraph and Telephone (NTT). The company researched and developed many technologies relevant to media services; for example, VR/AR streaming, resolution upscaling, and high-resolution video compression were being developed for the 2020 Tokyo Olympic Games.


My role was to investigate time perception in 360-degree virtual video environments, with the goal of reducing the perceived latency while different patches of the video environment were still buffering. A novel approach was developed in which the unready video patches would subtly move away from the viewer’s current gaze until they were ready for viewing. The project was implemented in Unity and used the Oculus Rift DK2.
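As a rough sketch of the approach (written in JavaScript for illustration; the project itself was implemented in Unity), each still-buffering patch can be nudged along the component of its direction that points away from the viewer's gaze, and rendered at its true position once its data is ready. The patch and gaze representations below are simplifications I am assuming.

```javascript
// Sketch of the gaze-aversion idea: unready patches are displaced away from
// the gaze direction until their video data has buffered. Directions are unit
// vectors on the viewing sphere; these data structures are illustrative only.
function normalize(v) {
  const len = Math.hypot(v.x, v.y, v.z);
  return { x: v.x / len, y: v.y / len, z: v.z / len };
}

function displacePatch(patchDir, gazeDir, strength) {
  // Push the patch along the component of its direction perpendicular to the
  // gaze, so it slides out of the viewer's focus.
  const dot = patchDir.x * gazeDir.x + patchDir.y * gazeDir.y + patchDir.z * gazeDir.z;
  const away = {
    x: patchDir.x - dot * gazeDir.x,
    y: patchDir.y - dot * gazeDir.y,
    z: patchDir.z - dot * gazeDir.z,
  };
  return normalize({
    x: patchDir.x + strength * away.x,
    y: patchDir.y + strength * away.y,
    z: patchDir.z + strength * away.z,
  });
}

// Each frame: displace only the patches that are still buffering.
function updatePatches(patches, gazeDir, strength = 0.1) {
  for (const p of patches) {
    p.renderDir = p.buffered ? p.direction : displacePatch(p.direction, gazeDir, strength);
  }
}
```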


It was found that the technique’s effectiveness depended greatly on the type of motion being viewed at a given time (e.g., following the movement of a ball versus viewing a static person). The results of the project were presented at a domestic Japanese conference held by IEICE (the Institute of Electronics, Information and Communication Engineers). The associated publication can be found in the left panel links.

Credit: Virtual Reality Times. Image modified to illustrate the “unbuffered” patches problem.

Crowdsourcing Haptic Effects

Obtaining Large Scale User Feedback
Time Period: 2015
Keywords:
  • Human Computer Interaction
  • Haptics
  • Phone Development
  • Research Methodology
  • JavaScript

In design, collecting feedback is critical. While it can be straightforward to collect feedback on visualizations and sounds, how can designers collect feedback for haptic effects that can only be physically felt in person?


The Sensory Perception and Interaction (SPIN) lab was determined to help haptic designers collect and use data about their designed vibrotactile effects in order to see what end-users thought of them. My role involved creating phone-browser-compatible vibration effects (proxies) that aimed to retain the informational and affective qualities of higher-fidelity effects, so that they could be deployed on crowdsourcing services such as Amazon’s Mechanical Turk.


To do this, I created JavaScript files that would turn a mobile phone’s vibration motor on and off for set durations of time. Through much iteration and team feedback, I created scripts that reproduced different vibration characteristics such as intensity, roughness, ramps, and oscillations.
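As a hedged sketch of what such a script can look like (the exact patterns used in the study are not reproduced here): the browser Vibration API only accepts alternating on/off durations, so perceived intensity can be approximated by varying the duty cycle of short pulse frames, and a ramp by increasing that duty cycle over time. The frame length and parameter choices below are illustrative.

```javascript
// Sketch of a browser vibration "proxy": navigator.vibrate() takes alternating
// on/off durations (ms), so intensity is approximated by the fraction of time
// the motor is on within short fixed-length frames. Frame length is illustrative.
function intensityPattern(intensity, durationMs, frameMs = 50) {
  const pattern = [];
  const onMs = Math.round(frameMs * Math.min(Math.max(intensity, 0), 1));
  for (let t = 0; t < durationMs; t += frameMs) {
    pattern.push(onMs, frameMs - onMs); // on, then off, within each frame
  }
  return pattern;
}

// A "ramp" proxy: the duty cycle (and thus perceived intensity) rises over time.
function rampPattern(durationMs, frameMs = 50) {
  const pattern = [];
  const frames = Math.floor(durationMs / frameMs);
  for (let i = 0; i < frames; i++) {
    const onMs = Math.round((frameMs * (i + 1)) / frames);
    pattern.push(onMs, frameMs - onMs);
  }
  return pattern;
}

// Play on a supporting phone browser (a user gesture is usually required).
if (typeof navigator !== "undefined" && navigator.vibrate) {
  navigator.vibrate(rampPattern(1000));
}
```

The study's actual effects were iterated on and validated with the team and end-users; the snippet above only shows the basic on/off mechanism such proxies rely on.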


Phone vibration proxy example.

It was found that these new versions of the effects successfully retained their original affective intentions. The findings of this work were published at CHI (the ACM Conference on Human Factors in Computing Systems), a top-tier HCI conference. The associated publication can be found in the left panel links.


Haptic Design Practices and Tools

Conceptual and Technical Design Support for Haptic Designers
Time Period: 2016 to 2020
Keywords:
  • Human Computer Interaction
  • Haptics
  • Research Methodology
  • ReactJs

Haptic technology could be the key to delivering the physicality expected in many experiences, such as virtual reality, gaming, education, and affective communication.


However, with so many types of haptic technologies out there, such as smartphones, smartwatches, game controllers, and more, what is the best way to design with haptic technology to support a physical experience?


Credit: Forbes (Left), Hacker Noon (Middle), Top Chartex (Right). Being able to feel physical objects, metaphors of emotions, and constraints could be helpful for VR, affective communication, and physical gesture training.

Credit: IT Pro Today (Top Left), Stephen Brewster (Top Right), My Nintendo News (Bottom Left), Tech Spot (Bottom Right). The many types of haptic technologies out there.

This is the challenge I sought to address as a student in the Sensory Perception and Interaction (SPIN) group at the University of British Columbia, where I researched the conceptual and technical design processes of haptic interaction.


In one project, I worked with a PhD student on Macaron, a haptic effect prototyping tool that lets haptic designers create, refine, and mix elements of different vibrotactile effects to quickly produce new effects for smartphones and smartwatches. My role was to develop quality-of-life features, such as saving and loading effects, using the React framework. The tool is linked in the left panel.
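For a sense of what a saving/loading feature can involve, here is a minimal React sketch; it is not Macaron's actual code or data format, and the keyframe representation ({ time, amplitude }) is an assumption made for illustration.

```javascript
// Illustrative save/load controls for a vibrotactile effect editor.
// Macaron's real component structure and file format are not reproduced here;
// an effect is assumed to be a list of keyframes: { time, amplitude }.
import React, { useState } from "react";

export function SaveLoadControls({ keyframes, onLoad }) {
  const [fileName, setFileName] = useState("effect.json");

  // Serialize the current keyframes and offer them as a downloadable file.
  const save = () => {
    const blob = new Blob([JSON.stringify({ keyframes }, null, 2)], {
      type: "application/json",
    });
    const url = URL.createObjectURL(blob);
    const a = document.createElement("a");
    a.href = url;
    a.download = fileName;
    a.click();
    URL.revokeObjectURL(url);
  };

  // Read a previously saved file and hand the keyframes back to the editor.
  const load = (event) => {
    const file = event.target.files[0];
    if (!file) return;
    file.text().then((text) => onLoad(JSON.parse(text).keyframes));
  };

  return (
    <div>
      <input value={fileName} onChange={(e) => setFileName(e.target.value)} />
      <button onClick={save}>Save effect</button>
      <input type="file" accept=".json" onChange={load} />
    </div>
  );
}
```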


In another project, I worked with another PhD student to research and conceptually design an interface for tuning the affective and pragmatic qualities of vibrations so that they would meet the demands of situations that smartphone and smartwatch users care about (e.g., useful notifications, affective communication). The associated publication is linked in the left panel.


How haptic designers and end users could conceptually benefit from tuning haptic effects parametrically.

I also worked with secondary school students to identify ways haptic technology could support educational experiences. We analyzed existing STEM and classroom approaches in order to ideate haptic designs that could teach STEM concepts in a physical manner. The students practiced core HCI research methodologies such as user interviews, cognitive walkthroughs, focus group interviews, and qualitative analysis.


Credit: UBC SPIN Lab (Magic Pen, Bottom), Haply Robotics (Haply, Top). An example conceptual sketch of how two haptic devices could be used for educational purposes. Design created by secondary school students.

My Master’s thesis systematically assessed the literature of different academic communities and compared their design practices for creating haptic applications. Specifically, the interaction design and engineering/psychology communities were characterized on the technical and conceptual aspects of designing haptic applications. The results suggested ways in which each community’s gaps in haptic design could be addressed by the strengths and practices of another community.


Marking Life’s Moments

Unobtrusively Documenting Life
Term: UBC 2014 Winter Term 2
Keywords:
  • Human Computer Interaction
  • Research Methodology
  • Life Documentation
  • Wearable

Life is full of wonderful fleeting moments, but how can we document them without obtrusively pulling out a phone or awkwardly reaching for a notebook to write down how we feel?


Credit: Canon. Life is full of fleeting moments; how can one find the time to document how they feel?

This concept served as the inspiration for the "Lifemarks" project in an undergraduate course in advanced Human Computer Interaction techniques (CPSC 444). In a team, we followed a design thinking-like process for systematically understanding possible user needs and approaches to life documentation, and used this knowledge to create new unobtrusive ways to document life’s passing moments.


Our solution consisted of a simple wearable device, powered by Arduino, that could be “clicked” to mark an important moment in a day. Users could later revisit the mark via a web interface, where enough metadata was available to trigger the memories needed to fill in the remaining details of the moment.
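As an illustrative sketch only (the project's actual metadata and implementation are not detailed here), a Lifemark on the web side could amount to a timestamped record created when the button is clicked, which the user annotates later from memory.

```javascript
// Hypothetical Lifemark record: only the timestamp is implied by the project
// description; any other fields would be assumptions.
const lifemarks = [];

function recordLifemark() {
  const mark = {
    id: lifemarks.length + 1,
    timestamp: new Date().toISOString(), // when the wearable button was clicked
    note: "", // filled in later through the web interface
  };
  lifemarks.push(mark);
  return mark;
}

// Later, the web interface lets the user fill in the details from memory.
function annotateLifemark(id, note) {
  const mark = lifemarks.find((m) => m.id === id);
  if (mark) mark.note = note;
}

recordLifemark();
annotateLifemark(1, "Ran into an old friend downtown; coffee turned into a long catch-up.");
```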

The Arduino hardware used to prototype Lifemarks.

The Lifemarks interface for editing details.

The main button to discreetly create a Lifemark.

Creating a Lifemark with the click of a button.


Collaboration in Cloud Services

Understanding How People Collaborate in Shared File Systems
Term
UBC 2017 Winter Term 2
Materials
Keywords
  • Human Computer Interaction
  • Research Methods
  • Personal Data Management
  • Collaboration

With life and work being shared online, issues in organization are bound to occur. For example, when relationships end, should shared files live or die? Or, in group projects, how does one negotiate deleting files to free up personal storage space?


These were the types of questions that drove our team to understand how collaboration currently occurs in online file sharing systems (e.g., Google Drive, Dropbox), in order to get a sense of how these systems could be redesigned to suit the nuances of file sharing under different social circumstances.


For a graduate HCI project course (CPSC 554m), our team conducted an extensive field study, complete with interviews and qualitative analysis, to understand different users and how they each approached shared-file collaboration.


In terms of insights, it was found that turn-taking was an essential activity that currently relied on a separate communication service (e.g., email, Facebook Messenger, Slack). Additionally, team members often took on roles (whether implicit or explicit) in activities such as organizing files and tasks. The notion of ownership of shared online files was found to be fluid and dynamic (e.g., earned through contributing content, or belonging to whoever created the original files). Conflicts in version tracking and in file naming/organizing conventions were also found to be common problems.


We used these findings to create various conceptual prototypes that addressed our insights into file sharing behaviour.

User Activity Overview - This conceptual view of a Google Drive file system allows users to identify who is actively working on certain aspects of a project’s files without having to use external communication.

Place/Time Tab Overview - This conceptual view of a Google Drive file system allows users to understand the files associated with different places/times and actions.


Contact Information

Email: mchun345[at]gmail[dot]com

LinkedIn: Profile Page

© 2020 Matthew Chun