Hi! I am Ruolin Wang (Violynne), a Research Assistant and Ph.D. student in the Human-Computer Interaction Research Group at the University of California, Los Angeles, advised by Prof. Xiang 'Anthony' Chen. Prior to joining UCLA, I worked in the Tsinghua Pervasive HCI Group through the Interdisciplinary Master Program across the School of Information Science and Technology, the Academy of Arts and Design, and the School of News and Journalism. I received an M.Sc. degree in Computer Science from Tsinghua University and a B.Eng. degree in Microelectronics from Tianjin University.
My mission is to create seamless experiences and improve the well-being of real-world users through interdisciplinary and inclusive research.
☕ My Study Is Calling for Participants! Use a writing and notating platform to complete 10-minute anonymous sessions 3 times within a week! Participants will be compensated at $20 per hour via an electronic Amazon gift card. Please reach out if you are interested. Check this page or contact me for more details.
EarTouch: Facilitating Smartphone Use for Visually Impaired People in Mobile and Public Scenarios
Ruolin Wang, Chun Yu, Xing-Dong Yang, Weijie He, Yuanchun Shi (CHI 2019, Best Paper Honorable Mention 🏅)
Interacting with a smartphone using touch input and speech output is challenging for visually impaired people in public and mobile scenarios, where only one hand may be available for input (e.g., the other one holding a cane) and privacy may not be guaranteed when playing speech output through the speakers. EarTouch is a one-handed interaction technique that allows users to interact with a smartphone by touching the screen with the ear. Users hold the smartphone in a talking position and listen to speech output from the ear speaker privately. Eight ear gestures were designed to support seven common tasks, including answering a phone call, sending a message, and map navigation. EarTouch also brings us a step closer to inclusive design for all users who may experience situational disabilities.