Predicting choice behaviour in economic games using gaze data encoded as scanpath images.

Author: Byrne SA; MoMiLab Research Unit, IMT School for Advanced Studies Lucca, Lucca, Italy., Reynolds APF; MoMiLab Research Unit, IMT School for Advanced Studies Lucca, Lucca, Italy., Biliotti C; AXES Research Unit, IMT School for Advanced Studies Lucca, Lucca, Italy., Bargagli-Stoffi FJ; Department of Biostatistics, Harvard University, Boston, USA., Polonio L; Department of Economics, Management and Statistics, University of Milano - Bicocca, Milan, Italy. luca.polonio@unimib.it., Riccaboni M; AXES Research Unit, IMT School for Advanced Studies Lucca, Lucca, Italy.
Language: English
Source: Scientific reports [Sci Rep] 2023 Mar 23; Vol. 13 (1), pp. 4722. Date of Electronic Publication: 2023 Mar 23.
DOI: 10.1038/s41598-023-31536-5
Abstract: Eye movement data have been extensively used by researchers studying decision-making in the strategic setting of economic games. In this paper, we demonstrate that both deep learning and support vector machine classifiers can accurately identify participants' decision strategies before they commit to action while playing games. Our approach focuses on creating scanpath images that best capture the dynamics of a participant's gaze behaviour in a form that is meaningful to the machine learning models' predictions. Our results demonstrate classification accuracy 18 percentage points higher than that of a baseline logistic regression model, which is traditionally used to analyse gaze data recorded during economic games. In a broader context, we aim to illustrate the potential for eye-tracking data to create information asymmetries in strategic environments in favour of those who collect and process the data. These information asymmetries could become especially relevant as eye tracking becomes more widespread in user applications, with the seemingly imminent mass adoption of virtual reality systems and the development of devices able to record eye movements outside the laboratory.
(© 2023. The Author(s).)
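
Note: the abstract describes encoding gaze recordings as scanpath images before classification. The sketch below is not the authors' implementation; it is a minimal illustration of the general technique, assuming fixations arrive as (x, y, duration) tuples in screen coordinates, with saccades drawn as connecting lines and fixation markers scaled by duration. The helper name render_scanpath and all parameters are hypothetical.

# Minimal sketch (assumed encoding, not the published pipeline):
# turn a fixation sequence into a scanpath image for an ML classifier.
import numpy as np
import matplotlib.pyplot as plt

def render_scanpath(fixations, screen=(1280, 1024), out_path="scanpath.png"):
    """Render fixations as a scanpath image: lines for saccades,
    duration-scaled markers for fixations."""
    xs = [f[0] for f in fixations]
    ys = [f[1] for f in fixations]
    durs = np.array([f[2] for f in fixations], dtype=float)

    fig, ax = plt.subplots(figsize=(4, 3), dpi=64)
    ax.set_xlim(0, screen[0])
    ax.set_ylim(screen[1], 0)          # invert y-axis: screen origin is top-left
    ax.axis("off")                     # no axes; the classifier sees pixels only
    ax.plot(xs, ys, lw=1.0, color="gray")          # saccades as connecting lines
    ax.scatter(xs, ys, s=durs / durs.max() * 200)  # marker area ~ fixation duration
    fig.savefig(out_path, bbox_inches="tight", pad_inches=0)
    plt.close(fig)

# Example: three fixations given as (x, y, duration in ms)
render_scanpath([(200, 300, 180), (640, 512, 420), (900, 200, 250)])

Images produced this way could then be fed to a convolutional network, or flattened into feature vectors for a support vector machine, which is consistent with the two classifier families named in the abstract.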
Database: MEDLINE