I am Ruolin Wang, a first-year Ph.D. student in the Human-Computer Interaction Research Group at the University of California, Los Angeles, advised by Prof. Xiang 'Anthony' Chen. Prior to joining UCLA, I worked at the Tsinghua HCI Lab, advised by Prof. Yuanchun Shi and Prof. Chun Yu.
I am interested in augmenting the abilities of both digital devices and humankind to provide seamless experiences and improve wellbeing (e.g., accessibility, health, emotion regulation, cognitive enhancement). By creating novel interaction techniques and deploying elegant solutions for real-world users, I pursue intellectual impact as well as practical value. My career blueprint is to work at the intersection of HCI, AI, neuroscience, and biosensing.
Tap-to-Pair: Associating Wireless Devices using Synchronous Tapping
Tengxiang Zhang, Xin Yi, Ruolin Wang, Yuntao Wang, Chun Yu, Yiqin Lu, Yuanchun Shi (IMWUT 2018)
Currently, most wireless devices are associated by selecting the advertiser's name from a list, which becomes inefficient and error-prone as the number of nearby devices grows. We propose a spontaneous device association mechanism that initiates pairing from advertising devices without hardware or firmware modifications. Users associate two devices by synchronizing taps on the advertising device with the blinking pattern displayed by the scanning device. We believe that Tap-to-Pair can unlock more possibilities for impromptu interactions in smart spaces.
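The core idea, matching a user's tap times against the scanning device's blink pattern, can be illustrated with a minimal sketch. The function name, tolerance values, and acceptance rule below are illustrative assumptions, not the paper's actual matching algorithm:

```python
import statistics

def taps_match_blinks(tap_times, blink_times, tolerance=0.15):
    """Toy synchrony check (illustrative only): accept a pairing request
    when every tap lands within `tolerance` seconds of its blink and the
    tap-to-blink offsets are mutually consistent."""
    if len(tap_times) != len(blink_times):
        return False
    offsets = [t - b for t, b in zip(tap_times, blink_times)]
    return (max(abs(o) for o in offsets) <= tolerance
            and statistics.pstdev(offsets) <= tolerance / 2)

# Taps closely follow the blink pattern at 0.0, 1.0, 2.5, 3.0 seconds.
print(taps_match_blinks([0.02, 1.01, 2.53, 3.02], [0.0, 1.0, 2.5, 3.0]))
```

Requiring a consistent offset, rather than just small ones, helps reject coincidental taps that happen to fall near a few blinks by chance.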
EarTouch: Facilitating Smartphone Use for Visually Impaired People in Public and Mobile Scenarios
Ruolin Wang, Chun Yu, Xing-Dong Yang, Weijie He, Yuanchun Shi (CHI 2019, Best Paper Honorable Mention 🏅)
Interacting with a smartphone using touch input and speech output is challenging for visually impaired people in public and mobile scenarios, where only one hand may be available for input (e.g., when the other is holding a cane) and privacy cannot be guaranteed when playing speech output over the loudspeaker. We propose EarTouch, a one-handed interaction technique that allows users to interact with a smartphone by performing gestures on the touchscreen with their ear. Users hold the smartphone in a talking position and listen to speech output privately through the ear speaker. EarTouch also brings us a step closer to inclusive design for all users who may experience situational impairments.
Preventing Unintentional Touches on Smartphones
with Xin Yi, Weijie He, Zhican Yang, Lihang Pan, Chun Yu, Yuanchun Shi (2016-2017 Project)
When interacting with smartphones, the holding hand may cause unintentional touches on the screen that disturb the interaction, which is annoying for users. We developed a capacitive image processing algorithm that identifies the patterns of unintentional touches, such as how a touch "grows" over time and the spatial relationships among concurrent touches on the screen. In mechanical and user tests, our algorithm rejected 96.33% of unintentional touches while rejecting only 1.32% of intentional touches.
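A heuristic in this spirit can be sketched in a few lines. The feature names and thresholds below are hypothetical stand-ins for illustration, not the deployed algorithm, which operates on full capacitive images:

```python
def is_unintentional(touch):
    """Toy classifier (illustrative assumptions only): flag a touch as
    unintentional if it sits near the screen edge and either grows
    slowly or covers a large area, as a gripping palm typically does."""
    near_edge = touch["x_mm"] < 5 or touch["x_mm"] > touch["screen_w_mm"] - 5
    slow_growth = touch["area_growth_mm2_per_ms"] < 0.05
    large_area = touch["area_mm2"] > 80
    return near_edge and (slow_growth or large_area)

# A palm resting on the bezel vs. a deliberate tap near screen center.
grip = {"x_mm": 2, "screen_w_mm": 70, "area_mm2": 120,
        "area_growth_mm2_per_ms": 0.02}
tap = {"x_mm": 35, "screen_w_mm": 70, "area_mm2": 25,
       "area_growth_mm2_per_ms": 0.4}
print(is_unintentional(grip), is_unintentional(tap))
```

The combination of location and temporal growth matters: either cue alone misfires, e.g., a deliberate thumb tap near the edge is intentional despite its position.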
SmartTouch: Providing Intelligent Multi-modal Interface for Visually Impaired Smartphone Users
with Lihang Pan, Xinghui Yan, Guanhong Liu, Chun Yu (2017-2018 Project)
The closest path to a natural user interface may be an intelligent proxy that supports multi-modal interactions according to end users' intentions. We take a first step toward improving the user experience of visually impaired smartphone users. Based on interviews and participatory design activities, we explore the proper roles of graphical, haptic, and voice UIs in this setting and establish guidelines for designing multi-modal user interfaces. The intelligent proxy serves as a control center that bridges the "Gulf of Execution and Evaluation": it automatically performs tasks once it understands the user's intentions.
BrainQuake: Auxiliary diagnosis for epilepsy
with Liang Xiang, Tong Zhao, Kang Wang, YanQin Lei, Bo Hong (2018 Project)
An intelligent cloud-based sEEG processing system designed to provide a more effective means of epilepsy surgery planning and to support research on the pathogenesis of epilepsy.
Intelligent Infusion System
with Jialiang Yu, Mi Xiao (2014-2015 Project)
Intravenous infusion is an important part of nursing work and one of the most common procedures in clinical treatment. For a long time, most hospitals and medical institutions have relied on manual operation. Our solution consists of infusion controllers and a management system. Based on an infrared drop sensor and a stepper motor, each infusion controller can detect and regulate the infusion speed. The management system exchanges infusion information (patient, drug, infusion speed, time, etc.) with the controllers over a wireless link.
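The sensing half of such a controller reduces to estimating the drip rate from the infrared sensor's drop timestamps. A minimal sketch, with the function name and units as assumptions of mine (real controllers would filter sensor noise and close the loop by driving the stepper motor):

```python
def drops_per_minute(drop_timestamps_s):
    """Estimate infusion rate from infrared drop-sensor timestamps
    (seconds). Sketch only: counts intervals between detected drops
    over the observation window."""
    if len(drop_timestamps_s) < 2:
        return 0.0  # not enough drops to estimate a rate
    span = drop_timestamps_s[-1] - drop_timestamps_s[0]
    return (len(drop_timestamps_s) - 1) * 60.0 / span

# One drop every 2 seconds corresponds to 30 drops per minute.
print(drops_per_minute([0.0, 2.0, 4.0, 6.0, 8.0]))
```

In a closed loop, the controller would compare this estimate against the prescribed rate and step the motor to pinch or release the tube accordingly.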