EARTOUCH

Facilitating Smartphone Use for Visually Impaired People in Mobile and Public Scenarios

ABSTRACT

Interacting with a smartphone using touch input and speech output is challenging for visually impaired people in mobile and public scenarios, where only one hand may be available for input (e.g., while holding a cane) and using the loudspeaker for speech output is constrained by environmental noise, privacy, and social concerns. To address these issues, we propose EarTouch, a one-handed interaction technique that allows users to interact with a smartphone using the ear to perform gestures on the touchscreen. Users hold the phone to their ears and listen to speech output from the ear speaker privately. We report how the technique was designed, implemented, and evaluated through a series of studies. Results show that EarTouch is easy, efficient, fun, and socially acceptable to use.

FULL CITATION

Ruolin Wang, Chun Yu, Xing-Dong Yang, Weijie He, and Yuanchun Shi. 2019. EarTouch: Facilitating Smartphone Use for Visually Impaired People in Mobile and Public Scenarios. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (CHI ’19). Association for Computing Machinery, New York, NY, USA, Paper 24, 1–13. DOI: https://doi.org/10.1145/3290605.3300254

EARTOUCH: FREQUENTLY ASKED QUESTIONS

  • What are the challenges in detecting ear gestures?

The ear is soft and has a complex shape. As a result, its contact pattern with the touchscreen is intricate and easily deformed, unlike the contact pattern of a finger touch; this makes it hard for the phone to track the ear’s movement and gestural input. Additionally, the ergonomics of interaction are reversed: EarTouch gestures have to be performed by moving the device rather than the input apparatus (the ear). Consequently, the interaction paradigm cannot be designed by directly leveraging the knowledge we have about finger-based touch.

EARTOUCH: STORIES BEHIND RESEARCH

...

📧Contact: violynne@ucla.edu

Latest News

April 16-18 2020: I will attend CRA-WP Grad Cohort Workshop for Women.

Feb 7 2020: One work accepted by CHI LBW.

Oct 25 2019: One work accepted by IMWUT.

Oct 22 2019: One work presented at UIST SIC.

Sep 20 2019: One paper submitted to CHI.

Aug 15 2019: One paper submitted to IMWUT.

July 30 2019: My first time attending SIGGRAPH.

 

Collaboration

UCLA Disabilities and Computing Program

Interconnected & Integrated Bioelectronics Lab @ Electrical and Computer Engineering, UCLA

NLP Group @ Computer Science, UCLA

Wearable Bioelectronics Research Group @ Bioengineering, UCLA

Laboratory for Clinical and Affective Psychophysiology @ Psychology, UCLA

 

Service

Accessibility Co-chair (UIST 2020)

Associate Chair (CHI LBWs 2020)

Peer Reviewer (CHI 2020, MobileHCI 2020)

Student Volunteer (UIST 2019)

Volunteer at Beijing Volunteer Service Foundation and the China Braille Library (2018)

 

Teaching

ECE 209AS Human-Computer Interaction, UCLA (2019 Fall)

 

Honors & Awards

Graduate with Distinction & Outstanding Thesis Award, Tsinghua University, 2019

Best Paper Honorable Mention Award (Top 5%), CHI 2019

National Scholarship (Top 1%), Ministry of Education of the People’s Republic of China, 2018

Second Prize, Tsinghua University 35th Challenge Cup, 2018

Comprehensive Scholarship (Top 4%), Tsinghua University, 2017

First Prize, GIX Innovation Competition, 2016

Outstanding Thesis Award, Tianjin University, 2015