Hi! I am Ruolin Wang (Violynne), a Research Assistant and Ph.D. student in the Human-Computer Interaction Research Group at the University of California, Los Angeles, advised by Prof. Xiang 'Anthony' Chen. Prior to joining UCLA, I worked in the Tsinghua Pervasive HCI Group. I received my M.Sc. degree in Computer Science from Tsinghua University and my B.Eng. degree in Microelectronics from Tianjin University.
My goal as an interdisciplinary researcher is to augment the abilities of both digital devices and humankind, providing seamless experiences and improving wellbeing. I pursue intellectual impact as well as practical value.
Ruolin Wang, Chun Yu, Xing-Dong Yang, Weijie He, Yuanchun Shi (CHI 2019, Best Paper Honorable Mention 🏅)
Interacting with a smartphone using touch input and speech output is challenging for visually impaired people in public and mobile scenarios, where only one hand may be available for input (e.g., with the other hand holding a cane) and privacy may not be guaranteed when speech output is played through the speakers. EarTouch is a one-handed interaction technique that allows users to interact with a smartphone by touching the screen with their ear. Users hold the smartphone in a talking position and listen to speech output privately through the ear speaker. Eight ear gestures were designed to support seven common tasks, such as answering a phone call, sending a message, and map navigation. EarTouch also brings us a step closer to inclusive design for all users who may experience situational disabilities.