Abstract

This work proposes a deep learning-based recurrent neural network (RNN) system that recognises emotions from voice and electrocardiogram (ECG) inputs. The voice and ECG signals are fed into the RNN, which extracts characteristic features from both signals. The extracted features are then passed to a fully connected layer for emotion classification. The proposed method is evaluated on a dataset of speech and ECG recordings collected from participants during emotional elicitation tasks, and it achieves more accurate emotion detection than conventional approaches. The experiments indicate that the proposed approach is effective at recognising emotions from both speech and ECG signals, highlighting its potential for real-world applications.
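The pipeline described above (per-frame speech and ECG features fed into an RNN, whose final hidden state is mapped by a fully connected layer to emotion classes) can be sketched as follows. This is a minimal illustrative sketch in PyTorch, not the authors' implementation; the feature dimensions, the choice of an LSTM, and the class names are assumptions.

```python
import torch
import torch.nn as nn

class MultiModalEmotionRNN(nn.Module):
    """Hypothetical sketch of the described architecture: per-frame
    speech and ECG features are concatenated, processed by an LSTM,
    and the final hidden state is classified by a fully connected layer."""

    def __init__(self, speech_dim=40, ecg_dim=8, hidden_dim=64, num_emotions=4):
        super().__init__()
        # Assumed dims: e.g. 40 mel-filterbank speech features, 8 ECG features per frame.
        self.rnn = nn.LSTM(speech_dim + ecg_dim, hidden_dim, batch_first=True)
        self.classifier = nn.Linear(hidden_dim, num_emotions)

    def forward(self, speech, ecg):
        # speech: (batch, time, speech_dim); ecg: (batch, time, ecg_dim),
        # assumed pre-aligned to the same frame rate.
        x = torch.cat([speech, ecg], dim=-1)
        _, (h_n, _) = self.rnn(x)          # h_n: (num_layers, batch, hidden_dim)
        return self.classifier(h_n[-1])    # logits: (batch, num_emotions)

model = MultiModalEmotionRNN()
logits = model(torch.randn(2, 100, 40), torch.randn(2, 100, 8))
print(tuple(logits.shape))  # (2, 4)
```

In practice the logits would be trained with a cross-entropy loss against labelled emotion categories; the abstract does not specify the loss, the number of emotion classes, or the exact feature extraction, so those details here are placeholders.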

Keywords:
Computer science; Emotion recognition; Speech recognition; Recurrent neural network; Artificial intelligence; Modal; Deep learning; Artificial neural network; Feature extraction; Machine learning; Pattern recognition (psychology)
