'What are you looking at?' An investigation into joint attention with a virtual human in VR: an EEG and eye-tracking study

Author: Kelly, Cliona, Kessler, Klaus, Bernardet, Ulysses, Meese, Tim, Zumer, Johanna
Year of publication: 2022
Subject:
DOI: 10.17605/osf.io/6pmq3
Description: Successfully coordinating attention allows mental representations to be aligned with those of others and, in turn, aids the development of relationships and bonding. However, the behaviours that create successful joint attention remain poorly understood. Here, we have developed a paradigm to investigate non-verbal behaviours (eye-contact and collaborative gaze sequences) that may provide insight into the details of this process. A computer-generated human, referred to as a virtual human (VH), is presented in an immersive virtual environment and is shown a puzzle piece that corresponds to either the participant’s or the VH’s puzzle board. The VH then performs a gaze sequence that is either collaborative, directing the participant’s attention to the correct board, or non-collaborative, looking elsewhere. Within these gaze sequences, the VH either engages in eye-contact with the participant or does not. Previous EEG research suggests that eye-contact evokes a stronger desynchronization of alpha activity than no eye-contact in a left fronto-central and central cluster. Additionally, the current paradigm involves direct eye-contact, averted gaze, and direction of attention. Neuroimaging studies have reported theta signatures in the temporo-parietal junction (TPJ) as an area and frequency related to the processing of social scenarios, including perspective taking (Wang et al., 2016). In particular, studies using functional imaging have reported the superior temporal gyrus as a specific area of the TPJ that may be responsible for eye-gaze processing (Carter & Huettel, 2013). During presentation, EEG and eye-tracking are recorded in order to investigate the influence of these conditions on alpha and theta activity during the encoding of and responding to these joint attention bids.
Database: OpenAIRE