Using large language models to accelerate communication for eye gaze typing users with ALS.

Author: Cai, Shanqing; Venugopalan, Subhashini; Seaver, Katie; Xiao, Xiang; Tomanek, Katrin; Jalasutram, Sri; Morris, Meredith Ringel; Kane, Shaun; Narayanan, Ajit; MacDonald, Robert L.; Kornman, Emily; Vance, Daniel; Casey, Blair; Gleason, Steve M.; Nelson, Philip Q.; Brenner, Michael P.
Source: Nature Communications; 11/1/2024, Vol. 15 Issue 1, p1-18, 18p
Abstract: Accelerating text input in augmentative and alternative communication (AAC) is a long-standing area of research bearing on the quality of life of individuals with profound motor impairments. Recent advances in large language models (LLMs) pose opportunities for re-thinking strategies for enhanced text entry in AAC. In this paper, we present SpeakFaster, consisting of an LLM-powered user interface for text entry in a highly abbreviated form, saving 57% more motor actions than traditional predictive keyboards in offline simulation. A pilot study on a mobile device with 19 non-AAC participants demonstrated motor savings in line with simulation and relatively small changes in typing speed. Lab and field testing on two eye-gaze AAC users with amyotrophic lateral sclerosis demonstrated text-entry rates 29–60% above baselines, due to significant savings of expensive keystrokes based on LLM predictions. These findings form a foundation for further exploration of LLM-assisted text entry in AAC and other user interfaces. Individuals with severe motor impairments use gaze to type and communicate. This paper presents a large language model-based user interface that enables gaze typing in highly abbreviated forms, achieving significant motor savings and speed gains. [ABSTRACT FROM AUTHOR]
Database: Complementary Index