Subject independent emotion recognition from EEG using VMD and deep learning
Author: | K. R. Seeja, Pallavi Pandey |
Year of publication: | 2022 |
Subject: | Facial expression, General Computer Science, Artificial neural network, Computer science, Speech recognition, Deep learning, Feature extraction, Networking & telecommunications, Electroencephalography (EEG), EEG patterns, Electrical/electronic/information engineering, Medicine, Artificial intelligence & image processing, Emotion recognition, Artificial intelligence, Classifier |
Source: | Journal of King Saud University - Computer and Information Sciences. 34:1730-1738 |
ISSN: | 1319-1578 |
DOI: | 10.1016/j.jksuci.2019.11.003 |
Description: | Emotion recognition from electroencephalography (EEG) has proved to be a good choice because EEG, unlike speech signals or facial expressions, cannot be mimicked. EEG signatures of emotion are not unique; they vary from person to person, as each individual responds differently to the same stimuli. EEG signals are therefore subject dependent and have proved effective for subject-dependent emotion recognition. However, subject-independent emotion recognition is important in situations such as recognizing the emotions of paralyzed patients or people with facial burns, where no pre-incident EEG recordings of the subjects are available to build the recognition model. Hence there is a need to identify common EEG patterns corresponding to each emotion, independent of the subject. This paper proposes a subject-independent emotion recognition technique for EEG signals that uses Variational Mode Decomposition (VMD) for feature extraction and a Deep Neural Network as the classifier. Performance evaluation on the benchmark DEAP dataset shows that the combination of VMD and a Deep Neural Network outperforms state-of-the-art techniques for subject-independent emotion recognition from EEG. |
Database: | OpenAIRE |
External link: |
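
The description outlines a two-stage pipeline: VMD decomposes each EEG channel into a fixed number of band-limited modes, statistics of those modes form the feature vector, and a deep neural network classifies the emotion. The sketch below is a minimal illustration of that idea, not the authors' implementation: it assumes the open-source vmdpy package for VMD, Keras for the network, randomly generated stand-in data in a DEAP-like shape, and simple per-mode statistics as features (the paper's exact features, mode count, and architecture may differ).

```python
# Minimal sketch of a VMD + DNN emotion-recognition pipeline.
# Assumptions (not from the paper): vmdpy for VMD, Keras for the DNN,
# random stand-in data, and mean/std/peak of each mode as features.
import numpy as np
from vmdpy import VMD
from tensorflow import keras

def vmd_features(signal, K=4, alpha=2000.0, tau=0.0, tol=1e-7):
    """Decompose one EEG channel into K modes; return simple statistics."""
    u, _, _ = VMD(signal, alpha, tau, K, DC=0, init=1, tol=tol)
    feats = []
    for mode in u:  # u has shape (K, len(signal)); one row per mode
        feats += [mode.mean(), mode.std(), np.abs(mode).max()]
    return np.array(feats)

# Hypothetical data: 40 trials of 32-channel EEG, 128 samples each
# (DEAP uses 32 EEG channels sampled at 128 Hz; 1-second windows here).
rng = np.random.default_rng(0)
X_raw = rng.standard_normal((40, 32, 128))
y = rng.integers(0, 2, size=40)  # e.g. binary high/low valence labels

# Feature matrix: VMD statistics for every channel of every trial.
X = np.array([np.concatenate([vmd_features(ch) for ch in trial])
              for trial in X_raw])

# A small fully connected network as the classifier.
model = keras.Sequential([
    keras.layers.Input(shape=(X.shape[1],)),
    keras.layers.Dense(128, activation="relu"),
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])
model.fit(X, y, epochs=5, batch_size=8, validation_split=0.2, verbose=0)
```

For a genuinely subject-independent evaluation, as the paper targets, the train/test split would be made across subjects (e.g. leave-one-subject-out) rather than across shuffled trials as in this toy split.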