JOURNAL ARTICLE

EEG Emotion Recognition Via Ensemble Learning Representations

Abstract

Electroencephalography (EEG)-based emotion recognition is attracting substantial interest because of its strong association with brain-computer interfaces. Although several works exist in the literature, it remains challenging to find discriminative features that generalize well across different EEG datasets. In this work, we focus on developing a deep learning model that exploits the spatial and temporal representations of the EEG signal to generate EEG embeddings for emotion recognition. The proposed model uses a self-attention mechanism along with a feature fusion approach to improve the discriminative power of the learned EEG embeddings. Comprehensive experiments on the DEAP dataset demonstrate the superiority of the proposed approach, which attains accuracies of 91.17% and 90.73% for arousal and valence classification, respectively.
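The abstract describes two ingredients: a self-attention mechanism applied to learned EEG representations, and fusion of spatial and temporal feature streams. The paper's exact architecture is not given here, so the following is only an illustrative sketch under assumed shapes (32 time windows, 64-dimensional features per stream): single-head scaled dot-product self-attention over the temporal stream, followed by simple concatenation fusion with the spatial stream.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X):
    # X: (time_steps, d) feature sequence.
    # Single-head scaled dot-product attention; learned Q/K/V
    # projections are omitted for brevity (assumption, not the
    # paper's actual parameterization).
    d = X.shape[-1]
    scores = (X @ X.T) / np.sqrt(d)       # pairwise similarity between time steps
    return softmax(scores, axis=-1) @ X   # attention-weighted mixture of steps

rng = np.random.default_rng(0)
spatial = rng.standard_normal((32, 64))   # hypothetical per-channel spatial features
temporal = rng.standard_normal((32, 64))  # hypothetical windowed temporal features

attended = self_attention(temporal)
fused = np.concatenate([spatial, attended], axis=-1)  # concatenation fusion
print(fused.shape)  # (32, 128)
```

The fused embedding would then feed a classifier head producing the binary arousal and valence predictions; the shapes and the concatenation-based fusion are assumptions for illustration only.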

Keywords:
Electroencephalography, Emotion recognition, Emotion classification, Affective computing, Arousal, Valence, Feature extraction, Pattern recognition, Discriminative model, Artificial intelligence, Neuroscience, Psychology

Metrics

Cited by: 4
FWCI (Field-Weighted Citation Impact): 1.06
References: 30
Citation Normalized Percentile: 0.68

Topics

EEG and Brain-Computer Interfaces
Life Sciences → Neuroscience → Cognitive Neuroscience
Emotion and Mood Recognition
Social Sciences → Psychology → Experimental and Cognitive Psychology
Gaze Tracking and Assistive Technology
Physical Sciences → Computer Science → Human-Computer Interaction