Hi! I am Ruolin Wang (Violynne), a Research Assistant and Ph.D. student in the Human-Computer Interaction Research Group at the University of California, Los Angeles, advised by Prof. Xiang 'Anthony' Chen. Prior to joining UCLA, I worked in the Tsinghua Pervasive HCI Group. I received my M.Sc. degree in Computer Science from Tsinghua University and my B.Eng. degree in Microelectronics from Tianjin University.

 

The next generation of HCI will witness the intertwined evolution of humans and machines. Machines will be augmented by extended sensing abilities, interconnection with the environment, and learning from biological mechanisms such as neural systems; humans will be augmented by new possibilities for interacting with the world and by machines that help them better understand their bodies and their physical and mental states. My career blueprint as an interdisciplinary researcher is to work at the intersection of sensing, AI, and brain science, augmenting the abilities of both digital devices and humankind to provide seamless experiences and improve wellbeing. By creating novel interaction techniques and deploying elegant solutions for real-world users, I pursue intellectual impact as well as practical value. [CV]

Theme | Assistive Technology

Since June 2017, I have devoted most of my passion and energy to building assistive technologies for blind and visually impaired people. By adopting a perspective of ability-based design, we can make these technologies accessible to a much wider range of users. In a broader sense, I see the potential of augmentation extending beyond accessibility.

EarTouch👂

Novel Interaction + Capacitive Sensing

SmartTouch

Interaction Proxy + Fabrication

Ongoing 💬

Information Retrieval + Q&A System

Ongoing 🔍

Computer Vision + Multi-modal Feedback


EarTouch: Facilitating Smartphone Use for Visually Impaired People in Public and Mobile Scenarios

Ruolin Wang, Chun Yu, Xing-Dong Yang, Weijie He, Yuanchun Shi (CHI 2019, Best Paper Honorable Mention 🏅)

Interacting with a smartphone using touch input and speech output is challenging for visually impaired people in public and mobile scenarios, where only one hand may be available for input (e.g., with the other holding a cane) and privacy cannot be guaranteed when speech output is played over the speakers. EarTouch is a one-handed interaction technique that allows users to interact with a smartphone by touching the screen with the ear. Users hold the smartphone in a talking position and listen to speech output privately through the ear speaker. Eight ear gestures support seven common tasks, including answering a phone call, sending a message, and map navigation. EarTouch also brings us a step closer to inclusive design for all users who may experience situational disabilities.

Theme | Medicine and Health Care

The potential of machines to help humans lead healthy lives, at both the professional and the personal level, remains largely unexplored.

Ongoing📒

Mental Health + AI

BrainQuake🧠

Epilepsy + ML

AirEx🌬️

Wearable Filtration System + Mobile App

Infusion System💉

Infrared Sensing + Database Management


Theme | Ubiquitous Sensing

"The most profound technologies are those that disappear. They weave themselves into the fabric of everyday life until they are indistinguishable from it." (Mark Weiser)

Synchronize Selection👏

Wireless Sensing + User Behavior Modeling

Advanced Materials🔬

in exploration

Fluidic Interfaces 🌊

in exploration


Theme | Exploring Brain Science

Is our brain smart enough to understand the brain? My curiosity about the brain has always been a driving force for my research.

Synesthesia🎇

Blending of the senses

Artificial Eye 👀

Implants designed to restore sight


📧Contact: violynne@ucla.edu

Latest News

Feb 7, 2020: One work accepted by CHI LBW.

Oct 25, 2019: One work accepted by IMWUT.

Oct 22, 2019: One work presented at UIST SIC.

Sep 20, 2019: One paper submitted to CHI.

Aug 15, 2019: One paper submitted to IMWUT.

July 30, 2019: My first time attending SIGGRAPH.

 

Collaboration

UCLA Depression Grand Challenge

UCLA Disabilities and Computing Program

Interconnected & Integrated Bioelectronics Lab @ Electrical and Computer Engineering, UCLA

NLP Group @ Computer Science, UCLA

Wearable Bioelectronics Research Group @ Bioengineering, UCLA

Laboratory for Clinical and Affective Psychophysiology @ Psychology, UCLA

 

Service

Accessibility Co-chair (UIST 2020)

Associate Chair (CHI LBWs 2020)

Reviewer (CHI 2020, MobileHCI 2020, IMWUT 2020)

Student Volunteer (UIST 2019)

Volunteer at Beijing Volunteer Service Foundation and the China Braille Library (2018)

 

Teaching

ECE 209AS Human-Computer Interaction, UCLA (2019 Fall)

 

Honors & Awards

Graduated with Distinction & Outstanding Thesis Award, Tsinghua University, 2019

Best Paper Honorable Mention Award (Top 5%), CHI 2019

National Scholarship (Top 1%), Ministry of Education of the People’s Republic of China, 2018

Second Prize, Tsinghua University 35th Challenge Cup, 2018

Comprehensive Scholarship (Top 4%), Tsinghua University, 2017

First Prize, GIX Innovation Competition, 2016

Outstanding Thesis Award, Tianjin University, 2015