Gaze-action coupling, gaze-gesture coupling, and exogenous attraction of gaze in dyadic interactions.
Author: | Hessels RS; Experimental Psychology, Helmholtz Institute, Utrecht University, Heidelberglaan 1, 3584CS, Utrecht, Netherlands. r.s.hessels@uu.nl., Li P; Information and Computing Sciences, Utrecht University, Utrecht, Netherlands., Balali S; Experimental Psychology, Helmholtz Institute, Utrecht University, Heidelberglaan 1, 3584CS, Utrecht, Netherlands., Teunisse MK; Experimental Psychology, Helmholtz Institute, Utrecht University, Heidelberglaan 1, 3584CS, Utrecht, Netherlands., Poppe R; Information and Computing Sciences, Utrecht University, Utrecht, Netherlands., Niehorster DC; Lund University Humanities Lab, Lund University, Lund, Sweden.; Department of Psychology, Lund University, Lund, Sweden., Nyström M; Lund University Humanities Lab, Lund University, Lund, Sweden., Benjamins JS; Experimental Psychology, Helmholtz Institute, Utrecht University, Heidelberglaan 1, 3584CS, Utrecht, Netherlands.; Social, Health and Organisational Psychology, Utrecht University, Utrecht, Netherlands., Senju A; Research Center for Child Mental Development, Hamamatsu University School of Medicine, Hamamatsu, Japan., Salah AA; Information and Computing Sciences, Utrecht University, Utrecht, Netherlands., Hooge ITC; Experimental Psychology, Helmholtz Institute, Utrecht University, Heidelberglaan 1, 3584CS, Utrecht, Netherlands. |
Language: | English |
Source: | Attention, Perception & Psychophysics [Atten Percept Psychophys] 2024 Nov 18. Date of Electronic Publication: 2024 Nov 18. |
DOI: | 10.3758/s13414-024-02978-4 |
Abstract: | In human interactions, gaze may be used to acquire information for goal-directed actions, to acquire information related to the interacting partner's actions, and in the context of multimodal communication. At present, there are no models of gaze behavior in the context of vision that adequately incorporate these three components. In this study, we aimed to uncover and quantify patterns of within-person gaze-action coupling, gaze-gesture and gaze-speech coupling, and coupling between one person's gaze and another person's manual actions, gestures, or speech (or exogenous attraction of gaze) during dyadic collaboration. We showed that in the context of a collaborative Lego Duplo-model copying task, within-person gaze-action coupling is strongest, followed by within-person gaze-gesture coupling, and coupling between gaze and another person's actions. When trying to infer gaze location from one's own manual actions, gestures, or speech, or those of the other person, only one's own manual actions were found to lead to better inference compared to a baseline model. The improvement in inferring gaze location was limited, contrary to what might be expected based on previous research. We suggest that inferring gaze location may be most effective for constrained tasks in which different manual actions follow in a quick sequence, while gaze-gesture and gaze-speech coupling may be stronger in unconstrained conversational settings or when the collaboration requires more negotiation. Our findings may serve as an empirical foundation for future theory and model development, and may further be relevant in the context of action/intention prediction for (social) robotics and effective human-robot interaction. Competing Interests: Declarations Competing Interests The authors report that there are no competing interests to declare. Ethics Approval This study was performed in line with the principles of the Declaration of Helsinki. The study was approved by the Ethics Committee of the Faculty of Social and Behavioural Sciences of Utrecht University (protocol number 22-0206). Consent to Participate All participants gave written informed consent. Consent for Publication The authors affirm that human research participants provided informed consent for publication of the videos available at https://osf.io/2q6f4/. Open Practices Statement Data files and example videos are available at https://osf.io/2q6f4/. The experiment and analyses were not preregistered. (© 2024. The Author(s).) |
Database: | MEDLINE |
External Link: |