

General Information

Locality: Merced, California

Phone: +1 209-228-3639



Address: 5200 N. Lake Road, Merced, CA 95343, US

Website: www.asarif.com


Facebook Blog





Human-Computer Interaction UC Merced 03.05.2021

Thank you ACM #TechNews for covering our work in this week's edition. https://technews.acm.org/archives.cfm #LipType

Human-Computer Interaction UC Merced 25.04.2021

A news article on our #CHI2021 papers, which present an improved digital lip reader, an independent repair model for silent speech and speech recognition, and a social study on the acceptability of lip reading as an input method. #accessibility #textentry

Human-Computer Interaction UC Merced 22.04.2021

The book will be used in the newly introduced course EECS 289: Topics in Human-Computer Interaction.

Human-Computer Interaction UC Merced 06.04.2021

Evaluating multi-step input methods with traditional performance metrics is difficult: those metrics fail to capture the actual performance of a multi-step constructive or chorded method and do not provide enough detail to aid in design improvements. We developed three new action-level metrics that account for the error rate, accuracy, and complexity of multi-step methods by observing the actions performed to type one character. This work is a collaboration with Steven Castellucci and Scott MacKenzie (York University) and Caitlyn Seim (Stanford University). Ph.D. student Gulnar Rakhmetulla will present this work at the 47th Graphics Interface Conference (#GI2021).
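The idea of action-level evaluation can be sketched in a few lines. This is only an illustration, not the paper's actual metric definitions: it assumes a hypothetical log in which each transcribed character is paired with the list of actions (tagged 'correct' or 'error') the user performed to produce it.

```python
def action_level_metrics(log):
    """Illustrative action-level summaries for a multi-step input method.

    log: list of (char, actions) pairs, where actions is a list of
    'correct'/'error' tags for each step taken to type that character.
    (Hypothetical format; the paper's metrics may differ.)
    """
    total_actions = sum(len(actions) for _, actions in log)
    error_actions = sum(actions.count('error') for _, actions in log)
    chars = len(log)
    return {
        # average number of actions needed per transcribed character
        # (a rough proxy for the method's complexity)
        'actions_per_char': total_actions / chars,
        # fraction of all actions that were erroneous
        'action_error_rate': error_actions / total_actions,
        # fraction of characters typed without any erroneous action
        'char_accuracy': sum(1 for _, a in log if 'error' not in a) / chars,
    }
```

For example, a chorded method that needs two to three steps per letter would report `actions_per_char` well above 1, a detail that a plain character-level error rate hides.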

Human-Computer Interaction UC Merced 25.03.2021

Most input methods for smartwatches require repeated actions, occupy most of the screen, or use difficult-to-learn layouts. Moreover, typing skill acquired on one device usually does not transfer to other devices. We developed #SwipeRing, a new ring-shaped segmented keyboard that resembles Qwerty and is optimized for transferring gesture-typing skill between devices. See how the gestures drawn on different devices and layouts are similar! Users reach a competitive 17 words per minute on a small smartwatch with little practice. Ph.D. student Gulnar Rakhmetulla will present this work at the 47th Graphics Interface Conference (#GI2021).
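The 17 words-per-minute figure uses the standard text-entry throughput convention, where one "word" is five characters. A minimal sketch (the (|T| − 1) adjustment, common in the text-entry literature, reflects that timing typically starts at the first keystroke):

```python
def words_per_minute(transcribed: str, seconds: float) -> float:
    """Text-entry throughput in WPM, with one word = 5 characters.

    transcribed: the text the participant produced
    seconds: elapsed time from the first to the last keystroke
    """
    return (len(transcribed) - 1) / seconds * 60.0 / 5.0
```

So transcribing "hello world" (11 characters) in 12 seconds yields 10 WPM under this convention.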