Enabling hand gesture customization on wrist-worn devices

Author: Xu, Xuhai; Gong, Jun; Brum, Carolina; Liang, Lilian; Suh, Bongsoo; Gupta, Kumar; Agarwal, Yash; Lindsey, Laurence; Kang, Runchang; Shahsavari, Behrooz; Nguyen, Tu; Nieto, Heriberto; Hudson, Scott E.; Maalouf, Charlie; Mousavi, Seyed; Laput, Gierad
Year of publication: 2022
Subject:
Document type: Working Paper
DOI: 10.1145/3491102.3501904
Description: We present a framework for gesture customization that requires minimal examples from users, all without degrading the performance of existing gesture sets. To achieve this, we first deployed a large-scale study (N=500+) to collect data and train an accelerometer-gyroscope recognition model, reaching a cross-user accuracy of 95.7% and a false-positive rate of 0.6 per hour when tested on everyday non-gesture data. Next, we designed a few-shot learning framework that derives a lightweight model from our pre-trained model, enabling knowledge transfer without performance degradation. We validated our approach through a user study (N=20) examining on-device customization with 12 new gestures, resulting in average accuracies of 55.3%, 83.1%, and 87.2% when using one, three, or five shots, respectively, to add a new gesture, while maintaining the same recognition accuracy and false-positive rate on the pre-existing gesture set. We further evaluated the usability of our real-time implementation with a user experience study (N=20). Our results highlight the effectiveness, learnability, and usability of our customization framework. Our approach paves the way for a future where users are no longer bound to pre-existing gestures, freeing them to creatively introduce new gestures tailored to their preferences and abilities.
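The record does not include source code, so as a rough illustration of the kind of few-shot transfer the description outlines (a frozen pre-trained accelerometer-gyroscope encoder whose classification head is extended with a trainable output for a user-defined gesture), a minimal PyTorch-style sketch might look like the following. The encoder architecture, window length, channel count, class count, and training loop are illustrative assumptions, not details taken from the paper.

```python
# Hypothetical sketch: few-shot gesture customization on top of a frozen pre-trained IMU model.
# All architecture and hyperparameter choices below are assumptions for illustration only.
import torch
import torch.nn as nn

WINDOW = 128       # samples per gesture window (assumed)
CHANNELS = 6       # 3-axis accelerometer + 3-axis gyroscope
N_PRETRAINED = 10  # size of the pre-existing gesture set (assumed)

class IMUEncoder(nn.Module):
    """Stand-in for the large pre-trained accelerometer-gyroscope model."""
    def __init__(self, embed_dim: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(CHANNELS, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(),
            nn.Linear(64, embed_dim),
        )

    def forward(self, x):  # x: (batch, CHANNELS, WINDOW)
        return self.net(x)

class CustomGestureHead(nn.Module):
    """Keeps the frozen pre-trained head intact and adds one trainable
    output for the newly customized gesture, so pre-existing gestures
    retain their original logits."""
    def __init__(self, old_head: nn.Linear):
        super().__init__()
        self.old_head = old_head
        for p in self.old_head.parameters():
            p.requires_grad_(False)
        self.new_head = nn.Linear(old_head.in_features, 1)

    def forward(self, emb):
        return torch.cat([self.old_head(emb), self.new_head(emb)], dim=-1)

def customize(encoder: IMUEncoder, head: CustomGestureHead,
              shots: torch.Tensor, epochs: int = 50) -> None:
    """Fit the new output from a handful of example windows ("shots")
    while the encoder and the original head stay frozen."""
    encoder.eval()
    for p in encoder.parameters():
        p.requires_grad_(False)

    labels = torch.full((shots.shape[0],), N_PRETRAINED)  # index of the new class
    opt = torch.optim.Adam(head.new_head.parameters(), lr=1e-2)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        opt.zero_grad()
        logits = head(encoder(shots))
        loss_fn(logits, labels).backward()
        opt.step()

# Example: add one new gesture from three shots (random placeholder windows).
enc = IMUEncoder()
head = CustomGestureHead(nn.Linear(64, N_PRETRAINED))
three_shots = torch.randn(3, CHANNELS, WINDOW)
customize(enc, head, three_shots)
```

Because only the added output is trained, the logits for the pre-existing gestures are untouched, which is one simple way to obtain the "no degradation of the existing gesture set" property the description claims; the paper's actual framework may differ substantially.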
Comment: Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems
Database: arXiv