Hi! I am Ruolin Wang (Violynne), a Research Assistant and Ph.D. student in the Human-Computer Interaction Research Group at the University of California, Los Angeles, advised by Prof. Xiang 'Anthony' Chen. Prior to joining UCLA, I worked in the Tsinghua Pervasive HCI Group. I received my M.Sc. degree in Computer Science from Tsinghua University and my B.Eng. degree in Microelectronics from Tianjin University.


The next generation of HCI will witness the intertwined evolution of humans and machines. Machines will be augmented by extended sensing abilities, interconnection with the environment, and learning from biological mechanisms such as neural systems; humans will be augmented by embracing more ways of interacting with the world and by having machines intervene to better understand the human body and its physical and mental states. My career blueprint as an interdisciplinary researcher is to work at the intersection of Sensing, AI, and Neuroscience, augmenting the abilities of both digital devices and humankind to provide seamless experiences and improve wellbeing. By creating novel interaction techniques and deploying elegant solutions for real-world users, I pursue intellectual impact as well as practical value. [CV]

Theme | Assistive Technology

Since June 2017, I have devoted most of my passion and energy to building assistive technologies for blind and visually impaired (BVI) people. By adopting a perspective of ability-based design, we can make technologies accessible to a much wider range of users. In a broader sense, I see the potential of augmentation beyond accessibility. Specifically, my work focuses on:

  • Natural user interfaces to enhance smartphone accessibility (EarTouch, SmartTouch)
  • Wearable devices to enhance interaction with surroundings (AuxiScope, PneuFetch)
  • Information retrieval and visual interpretation (ongoing)

EarTouch👂

Novel Interaction + Capacitive Sensing

SmartTouch

Interaction Proxy + Fabrication

Ongoing 💬

Information Retrieval + Q&A System

Ongoing 🔍

Computer Vision + Multimodal Feedback


EarTouch: Facilitating Smartphone Use for Visually Impaired People in Public and Mobile Scenarios

Ruolin Wang, Chun Yu, Xing-Dong Yang, Weijie He, Yuanchun Shi (CHI 2019, Best Paper Honorable Mention 🏅)

Interacting with a smartphone using touch input and speech output is challenging for visually impaired people in public and mobile scenarios, where only one hand may be available for input (e.g., with the other holding a cane) and privacy may not be guaranteed when playing speech output through the speakers. EarTouch is a one-handed interaction technique that allows users to interact with a smartphone by touching their ear to the screen. Users hold the smartphone in a talking position and listen to speech output privately through the ear speaker. Eight ear gestures were deployed to support seven common tasks, including answering a phone call, sending a message, and map navigation. EarTouch also brings us a step closer to inclusive design for all users who may experience situational impairments.

Theme | Medicine and Health Care

The potential of machines and AI to help humans lead healthy lives remains largely unexplored.

  • Professional tools for hospitals (BrainQuake, Infusion+)
  • Personal tools for health care (Ongoing, AirEx)

Ongoing📒

Mental Health + AI

BrainQuake🧠

Epilepsy + ML

AirEx🌬️

Wearable Filtration System + Mobile App

Infusion+💉

Infrared Sensing + Database Management


Theme | Ubiquitous Sensing

"The most profound technologies are those that disappear. They weave themselves into the fabric of everyday life until they are indistinguishable from it." — Mark Weiser

  • Synchronizing behaviors for novel interactions
  • Bioelectronics for novel sensing

Tap-to-Pair👏

Wireless Sensing for Device Association

Synchronize Selection😉

Multimodal User Behavior Modeling

Advanced Materials🔬

in exploration

Fluidic Interfaces 🌊

in exploration


Theme | Neuroscience

Is our brain smart enough to understand the brain? My curiosity about the brain has always been the driving force behind my research. I am interested in:

  • Synesthesia: the blending of the senses
  • Artificial Eye: implants designed to restore sight

📧Contact: violynne@ucla.edu

Latest News

May 6 2020: One work submitted to UIST.

March 20 2020: Finished a course on Bioelectronics.

Feb 7 2020: One work accepted by CHI LBW.

Dec 13 2019: Finished a course on Neuroscience.

Dec 8 2019: One work rejected by CHI.

Oct 25 2019: One work accepted by IMWUT.

Oct 22 2019: One work presented at UIST SIC.

Sep 20 2019: One paper submitted to CHI.

Aug 15 2019: One paper submitted to IMWUT.

July 30 2019: My first time attending SIGGRAPH.

 

Collaboration

UCLA Depression Grand Challenge

UCLA Disabilities and Computing Program

NLP Group @ Computer Science, UCLA

Laboratory for Clinical and Affective Psychophysiology @ Psychology, UCLA

ACE, Makeability, Make4all @ UW

 

Service

Publicity Chair (ISS 2020)

Accessibility Co-chair (UIST 2020)

Associate Chair (CHI LBWs 2020)

Reviewer (CHI, UIST, MobileHCI, IMWUT, IEEE RO-MAN 2020)

Student Volunteer (UIST 2019)

Volunteer at Beijing Volunteer Service Foundation and the China Braille Library (2018)

 

Teaching

ECE 209AS Human-Computer Interaction, UCLA (2019 Fall)

 

Honors & Awards

Selected for a SIGCHI Student Travel Grant, 2020

Selected to CRA-WP Grad Cohort for Women, 2020

Graduate with Distinction & Outstanding Thesis Award, Tsinghua University, 2019

Best Paper Honorable Mention Award (Top 5%), CHI 2019

National Scholarship (Top 1%), Ministry of Education of the People’s Republic of China, 2018

Second Prize, Tsinghua University 35th Challenge Cup, 2018

Comprehensive Scholarship (Top 4%), Tsinghua University, 2017

First Prize, GIX Innovation Competition, 2016

Outstanding Thesis Award, Tianjin University, 2015

  

Invited Talks

"Inclusive Design: Accessibility Ignites Innovation" at TEDxTHU, 2018

 

Selected Press

Aliyun: 2020 Top 10 Charity Project

TechCrunch: Alibaba made a smart screen to help blind people shop and it costs next to nothing

The Next Web: Alibaba’s inexpensive smart display tech makes shopping easier for the visually impaired 

Techengage: Alibaba's Smart Touch is everything for the visually impaired