Automatic subject-specific spatiotemporal feature selection for subject-independent affective BCI

Author: Chun-Hsi Huang, Badar Almarri, Sanguthevar Rajasekaran
Language: English
Year of publication: 2021
Subject:
Support Vector Machine
Computer science
Physiology
Entropy
Emotions
Social Sciences
02 engineering and technology
Machine Learning
0302 clinical medicine
Learning and Memory
Sociology
0202 electrical engineering, electronic engineering, information engineering
Medicine and Health Sciences
Preprocessor
Psychology
Clinical Neurophysiology
Brain Mapping
Multidisciplinary
Physics
Applied Mathematics
Simulation and Modeling
Software Engineering
Social Communication
Electroencephalography
Electrophysiology
Bioassays and Physiological Analysis
Brain Electrophysiology
Brain-Computer Interfaces
Physical Sciences
Medicine
Engineering and Technology
Thermodynamics
Algorithms
Curse of dimensionality
Research Article
Computer and Information Sciences
Imaging Techniques
Feature vector
Science
Neurophysiology
Feature selection
Neuroimaging
Research and Analysis Methods
03 medical and health sciences
Artificial Intelligence
020204 information systems
Support Vector Machines
Learning
Humans
Preprocessing
Electrodes
Selection (genetic algorithm)
Brain–computer interface
Electrophysiological Techniques
Cognitive Psychology
Biology and Life Sciences
Pattern recognition
Pipeline (software)
Communications
Support vector machine
Cognitive Science
Artificial intelligence
Clinical Medicine
030217 neurology & neurosurgery
Mathematics
Neuroscience
Source: PLoS ONE
PLoS ONE, Vol 16, Iss 8, p e0253383 (2021)
ISSN: 1932-6203
Description: The high dimensionality of the spatially distributed channels and the temporal resolution of electroencephalogram (EEG) based brain-computer interfaces (BCI) undermine emotion recognition models. Therefore, before modeling such data in the final stage of the learning pipeline, adequate preprocessing and the transformation and extraction of temporal (i.e., time-series signal) and spatial (i.e., electrode channel) features are essential for recognizing underlying human emotions. Conventionally, inter-subject variation is handled either by avoiding its sources (e.g., discarding outliers) or by turning the problem into a subject-dependent one. We address this issue instead by preserving and learning from individual particularities in responses to affective stimuli. This paper investigates and proposes a subject-independent emotion recognition framework that mitigates subject-to-subject variability in such systems. Using an unsupervised feature selection algorithm, we reduce the feature space extracted from the time-series signals. For the spatial features, we propose a subject-specific unsupervised learning algorithm that learns from inter-channel co-activation online. We tested this framework on real EEG benchmarks, namely DEAP, MAHNOB-HCI, and DREAMER. We trained and tested the selection outcomes using nested cross-validation and a support vector machine (SVM), and compared our results with state-of-the-art subject-independent algorithms. Our results show improved performance, classifying human affect (i.e., valence and arousal) 16%–27% more accurately than other studies. This work not only outperforms other subject-independent studies reported in the literature but also proposes an online analysis solution to affect recognition.
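
To make the evaluation protocol described in the abstract concrete, below is a minimal Python sketch (not the authors' code) of a subject-independent pipeline: an unsupervised feature-selection step followed by nested cross-validation with an SVM. The feature matrix, labels, and subject IDs are synthetic placeholders, and scikit-learn's VarianceThreshold merely stands in for the paper's unsupervised temporal and spatial selection algorithms.

# A minimal sketch, assuming a pre-extracted feature matrix X (trials x features),
# binary valence labels y, and per-trial subject IDs. This is NOT the authors'
# released code; VarianceThreshold stands in for the paper's unsupervised
# temporal/spatial selection, and GroupKFold makes the outer split
# subject-independent (no subject appears in both training and test folds).
import numpy as np
from sklearn.feature_selection import VarianceThreshold
from sklearn.model_selection import GridSearchCV, GroupKFold, KFold, cross_val_score
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.standard_normal((120, 300))       # placeholder: 120 trials x 300 features
y = rng.integers(0, 2, size=120)          # placeholder: binary valence labels
subjects = rng.integers(0, 10, size=120)  # placeholder: subject ID per trial

pipe = Pipeline([
    ("select", VarianceThreshold()),      # unsupervised feature selection (stand-in)
    ("scale", StandardScaler()),
    ("svm", SVC(kernel="rbf")),
])

# Nested cross-validation: the inner loop tunes SVM hyperparameters,
# the outer (subject-grouped) loop estimates subject-independent accuracy.
param_grid = {"svm__C": [0.1, 1, 10], "svm__gamma": ["scale", 0.01]}
inner_cv = KFold(n_splits=3, shuffle=True, random_state=0)
outer_cv = GroupKFold(n_splits=5)

search = GridSearchCV(pipe, param_grid, cv=inner_cv)
scores = cross_val_score(search, X, y, groups=subjects, cv=outer_cv)
print(f"Nested CV accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")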
Database: OpenAIRE
The full text is not displayed to unauthenticated users.