Look me in the eye: evaluating the accuracy of smartphone-based eye tracking for potential application in autism spectrum disorder research.
Author: Strobl MAR; Wolfson Centre for Mathematical Biology, Mathematical Institute, University of Oxford, Radcliffe Observatory Quarter, OX2 6GG, Oxford, UK. maximilian.strobl@gmail.com; Department of Integrated Mathematical Oncology, Moffitt Cancer Center, Magnolia Drive, 12902, Tampa, USA. maximilian.strobl@gmail.com. Lipsmeier F; Roche Pharma Research and Early Development, pRED Informatics, Roche Innovation Center, F. Hoffmann-La Roche Ltd, Basel, Switzerland. Demenescu LR; Roche Pharma Research and Early Development, pRED Informatics, Roche Innovation Center, F. Hoffmann-La Roche Ltd, Basel, Switzerland. Gossens C; Roche Pharma Research and Early Development, pRED Informatics, Roche Innovation Center, F. Hoffmann-La Roche Ltd, Basel, Switzerland. Lindemann M; Roche Pharma Research and Early Development, pRED Informatics, Roche Innovation Center, F. Hoffmann-La Roche Ltd, Basel, Switzerland. De Vos M; Department of Engineering Science, Institute of Biomedical Engineering, University of Oxford, Old Road Campus Research Building, OX3 7DQ, Oxford, UK.
Language: English
Source: Biomedical Engineering Online [Biomed Eng Online] 2019 May 03; Vol. 18 (1), pp. 51. Date of Electronic Publication: 2019 May 03.
DOI: 10.1186/s12938-019-0670-1
Abstract: Background: Avoidance of looking others in the eye is a characteristic symptom of Autism Spectrum Disorder (ASD), and it has been hypothesised that quantitative monitoring of gaze patterns could be useful for objectively evaluating treatments. However, tools to measure gaze behaviour on a regular basis at a manageable cost are lacking. In this paper, we investigated whether a smartphone-based tool could address this problem. Specifically, we assessed the accuracy with which the phone-based, state-of-the-art eye-tracking algorithm iTracker can distinguish between gaze towards the eyes and gaze towards the mouth of a face displayed on the smartphone screen. This could enable mobile, longitudinal monitoring of gaze-aversion behaviour in ASD patients in the future. Results: We simulated a smartphone application in which subjects were shown an image on the screen and their gaze was analysed using iTracker. We evaluated the accuracy of our set-up across three tasks in a cohort of 17 healthy volunteers. In the first two tasks, subjects were shown different-sized images of a face and asked to alternate their gaze focus between the eyes and the mouth. In the last task, participants were asked to trace out a circle on the screen with their eyes. We confirm that iTracker can recapitulate the true gaze patterns and capture the relative position of gaze correctly, even on a different phone system than the one it was trained on. Subject-specific bias can be corrected using an error model informed by the calibration data. We compare two calibration methods and observe that a linear model performs better than a previously proposed support vector regression-based method. Conclusions: Under controlled conditions it is possible to reliably distinguish between gaze towards the eyes and gaze towards the mouth with a smartphone-based set-up. However, future research will be required to improve the robustness of the system to the roll angle of the phone and the distance between the user and the screen, so as to allow deployment in a home setting. We conclude that a smartphone-based gaze-monitoring tool provides promising opportunities for more quantitative monitoring of ASD.
Database: MEDLINE
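The abstract reports that subject-specific bias in the raw gaze estimates can be corrected with a linear model fitted to calibration data. The paper's own implementation is not shown in this record; the following is a minimal sketch, assuming calibration points and raw tracker estimates are given as 2-D screen coordinates and fitting a per-subject affine (linear plus intercept) correction by least squares. All names and the synthetic data are illustrative, not from the paper.

```python
import numpy as np

def fit_linear_calibration(raw, target):
    """Fit an affine correction raw -> target by least squares.

    raw, target: (n, 2) arrays of (x, y) gaze points on the screen.
    Returns a (3, 2) coefficient matrix W such that [x, y, 1] @ W ~ target.
    """
    X = np.hstack([raw, np.ones((raw.shape[0], 1))])  # append intercept column
    W, *_ = np.linalg.lstsq(X, target, rcond=None)
    return W

def apply_calibration(raw, W):
    """Apply the fitted affine correction to raw gaze estimates."""
    X = np.hstack([raw, np.ones((raw.shape[0], 1))])
    return X @ W

# Synthetic example: simulate a subject-specific bias of +1.0 in x
# and a 0.9 scale error in y on 20 known calibration targets.
rng = np.random.default_rng(0)
true_points = rng.uniform(0.0, 10.0, size=(20, 2))          # calibration targets
raw_estimates = true_points * [1.0, 0.9] + [1.0, 0.0]       # simulated tracker output

W = fit_linear_calibration(raw_estimates, true_points)
corrected = apply_calibration(raw_estimates, W)
print(np.abs(corrected - true_points).max())  # residual error after correction
```

Because the simulated bias here is itself affine and noiseless, the correction recovers the targets almost exactly; with real tracker output the residual would reflect noise and any non-linear distortion, which is where a more flexible method such as support vector regression could in principle differ.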