Selective visual attention during public speaking in an immersive context
Author: Mikael Rubin, Sihang Guo, Karl Muller, Ruohan Zhang, Michael J. Telch, Mary M. Hayhoe
Year of publication: 2022
Source: Atten Percept Psychophys
ISSN: 1943-3921 (print); 1943-393X (electronic)
DOI: 10.3758/s13414-021-02430-x
Description: It has recently become feasible to study selective visual attention to social cues in increasingly ecologically valid ways. In this secondary analysis, we examined gaze behavior in response to the actions of others in a social context. Participants (N = 84) were asked to give a 5-minute speech to a five-member audience that had been filmed in 360° video and was displayed in a virtual reality headset with a built-in eye tracker. Audience members were coached to make movements indicating interest or lack of interest (e.g., nodding vs. looking away). The goal of this paper was to analyze whether these actions influenced the speaker's gaze. We found that participants reliably directed gaze towards audience member actions in general, and specifically towards actions involving a phone (compared with other actions such as looking away or leaning back). However, there were no differences in gaze towards actions reflecting interest (like nodding) compared with actions reflecting lack of interest (like looking away). Participants were also more likely to look away in response to audience member actions, but no specific action elicited looking away more or less than others. Taken together, these findings suggest that the actions of audience members broadly influence gaze behavior in a realistic, contextually embedded (public speaking) setting. Further research is needed to clarify these findings in more controlled laboratory environments as well as in the real world.
Database: OpenAIRE
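The core measurement in the abstract (whether gaze landed on an audience member's action) can be pictured as an angular area-of-interest (AOI) test around the acting audience member during the action's time window. The sketch below is a minimal illustration only: the data structures (`GazeSample`, `AudienceAction`), the 10° AOI radius, and the example values are all assumptions, not the authors' actual pipeline.

```python
"""Minimal sketch of an AOI-based gaze-to-action analysis.
All structures and thresholds here are illustrative assumptions."""

from dataclasses import dataclass

@dataclass
class GazeSample:
    t: float      # timestamp in seconds
    yaw: float    # horizontal gaze angle in degrees (0 = straight ahead)
    pitch: float  # vertical gaze angle in degrees

@dataclass
class AudienceAction:
    start: float  # action onset (s)
    end: float    # action offset (s)
    yaw: float    # horizontal angle of the acting audience member
    pitch: float  # vertical angle of the acting audience member
    kind: str     # e.g. "nodding", "looking_away", "phone"

def gaze_on_action(samples, action, radius_deg=10.0):
    """Fraction of gaze samples during the action window that fall
    within an angular AOI around the acting audience member."""
    in_window = [s for s in samples if action.start <= s.t <= action.end]
    if not in_window:
        return 0.0
    hits = sum(
        1 for s in in_window
        if abs(s.yaw - action.yaw) <= radius_deg
        and abs(s.pitch - action.pitch) <= radius_deg
    )
    return hits / len(in_window)

# Example: compare AOI dwell across action types (values are made up).
samples = [GazeSample(t=0.1 * i, yaw=5.0, pitch=0.0) for i in range(600)]
actions = [
    AudienceAction(start=10.0, end=12.0, yaw=5.0, pitch=0.0, kind="nodding"),
    AudienceAction(start=30.0, end=32.0, yaw=-40.0, pitch=0.0, kind="phone"),
]
for a in actions:
    print(a.kind, round(gaze_on_action(samples, a), 2))
```

Comparing the mean of this per-action dwell fraction between action categories (e.g., interest vs. lack of interest, or phone vs. other) would mirror the kinds of contrasts the abstract reports, though the paper's actual statistical approach is not specified here.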