Using event-related brain potentials to evaluate motor-auditory latencies in virtual reality.

Authors: Feder S (Cognitive Systems Lab, Institute of Physics, Faculty of Natural Sciences, Chemnitz University of Technology, Chemnitz, Germany); Miksch J (Cognitive Systems Lab, Institute of Physics, Faculty of Natural Sciences, Chemnitz University of Technology, Chemnitz, Germany; Physics of Cognition Group, Institute of Physics, Faculty of Natural Sciences, Chemnitz University of Technology, Chemnitz, Germany); Grimm S (Cognitive Systems Lab, Institute of Physics, Faculty of Natural Sciences, Chemnitz University of Technology, Chemnitz, Germany; Physics of Cognition Group, Institute of Physics, Faculty of Natural Sciences, Chemnitz University of Technology, Chemnitz, Germany); Krems JF (Research Group Cognitive and Engineering Psychology, Institute of Psychology, Faculty of Behavioural and Social Sciences, Chemnitz University of Technology, Chemnitz, Germany); Bendixen A (Cognitive Systems Lab, Institute of Physics, Faculty of Natural Sciences, Chemnitz University of Technology, Chemnitz, Germany)
Language: English
Source: Frontiers in neuroergonomics [Front Neuroergon] 2023 Jul 05; Vol. 4, pp. 1196507. Date of Electronic Publication: 2023 Jul 05 (Print Publication: 2023).
DOI: 10.3389/fnrgo.2023.1196507
Abstract: Actions in the real world have immediate sensory consequences. Mimicking these in digital environments is within reach, but technical constraints usually impose a certain latency (delay) between user actions and system responses. It is important to assess the impact of this latency on the users, ideally with measurement techniques that do not interfere with their digital experience. One such unobtrusive technique is electroencephalography (EEG), which can capture the users' brain activity associated with motor responses and sensory events by extracting event-related potentials (ERPs) from the continuous EEG recording. Here we exploit the fact that the amplitude of sensory ERP components (specifically, N1 and P2) reflects the degree to which the sensory event was perceived as an expected consequence of one's own action (self-generation effect). Participants (N = 24) elicit auditory events in a virtual-reality (VR) setting by entering codes on virtual keypads to open doors. In a within-participant design, the delay between user input and sound presentation is manipulated across blocks. Occasionally, the virtual keypad is operated by a simulated robot instead, yielding a control condition with externally generated sounds. Results show that N1 (but not P2) amplitude is reduced for self-generated relative to externally generated sounds, and P2 (but not N1) amplitude is modulated by delay of sound presentation in a graded manner. This dissociation between N1 and P2 effects maps back to basic research on self-generation of sounds. We suggest P2 amplitude as a candidate read-out to assess the quality and immersiveness of digital environments with respect to system latency.
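The core measurement technique named in the abstract, extracting ERPs from the continuous EEG, amounts to cutting fixed-length epochs around event markers (here, sound onsets), baseline-correcting each epoch, and averaging across trials so that event-locked activity survives while unrelated noise cancels. A minimal single-channel sketch with hypothetical toy data (not the authors' actual analysis pipeline, and with an invented `extract_erp` helper):

```python
# Illustrative sketch: epoch a continuous single-channel trace around
# event markers, baseline-correct, and average to obtain an ERP.

def extract_erp(signal, event_samples, pre, post):
    """Average fixed-length epochs of `signal` around each event index.

    signal        : list of floats, one value per sample (single channel)
    event_samples : sample indices at which events (e.g. sounds) occurred
    pre, post     : number of samples to keep before / after each event
    """
    epochs = []
    for ev in event_samples:
        if ev - pre < 0 or ev + post > len(signal):
            continue  # skip events too close to the recording edges
        epoch = signal[ev - pre : ev + post]
        baseline = sum(epoch[:pre]) / pre          # mean of pre-event interval
        epochs.append([x - baseline for x in epoch])  # baseline correction
    # Point-wise average across trials: event-locked activity remains,
    # activity unrelated to the event tends to cancel out.
    n = len(epochs)
    return [sum(vals) / n for vals in zip(*epochs)]

# Toy demonstration: a small deflection repeated 2 samples after each event
signal = [0.0] * 100
events = [20, 50, 80]
for ev in events:
    signal[ev + 2] += 1.0
erp = extract_erp(signal, events, pre=5, post=10)
print(erp[5 + 2])  # averaged response at the deflection latency → 1.0
```

In practice, component amplitudes such as N1 and P2 would then be read off the averaged waveform within their typical latency windows and compared across conditions (self-generated vs. externally generated, and across delay blocks).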
Competing Interests: The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest. AB declared that they were an editorial board member of Frontiers at the time of submission. This had no impact on the peer review process or the final decision.
(Copyright © 2023 Feder, Miksch, Grimm, Krems and Bendixen.)
Database: MEDLINE