JOURNAL ARTICLE

A Multi-Head Convolution Network with Attention Consistency for Facial Expression Recognition

Abstract

In recent years, demand for facial expression recognition applications has grown rapidly, and the topic has received extensive attention from researchers. However, current deep-learning-based recognition methods ignore the ideas of multi-head attention and semantic consistency, so the model attends only to a local region of the feature map. In addition, the model's attention to an image and its horizontally flipped counterpart is inconsistent, which leads to poor robustness, poor interpretability, and other shortcomings. To address these problems, we propose an Affinity Separation Loss (ASLoss), which improves the separability of samples through clustering. We also design a Separate Multi-head Attention block (SMA) and a Zonal Loss (ZLoss) to decentralize the model's attention. Experimental results demonstrate that the proposed MACNet achieves competitive recognition performance on two public datasets, RAF-DB and FERPlus.
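The abstract does not give the exact formulation of the attention-consistency constraint, but the core idea it describes (the model should attend to the same facial regions in an image and in its horizontal mirror) can be sketched as a simple penalty between the two attention maps. The function name and the mean-squared form below are illustrative assumptions, not the paper's actual loss:

```python
import numpy as np

def attention_consistency_loss(attn_orig, attn_flip):
    """Penalize disagreement between the attention map of an image and the
    attention map of its horizontally flipped copy (illustrative sketch).

    attn_orig: (H, W) attention map for the original image.
    attn_flip: (H, W) attention map for the flipped image.
    """
    # Mirror the second map back so both maps share the same geometry.
    attn_flip_restored = attn_flip[:, ::-1]
    # Mean-squared difference as a simple consistency penalty.
    return float(np.mean((attn_orig - attn_flip_restored) ** 2))

# A perfectly consistent pair: the second map is the mirror of the first.
a = np.array([[0.1, 0.9], [0.4, 0.6]])
b = a[:, ::-1]
print(attention_consistency_loss(a, b))  # 0.0
```

In training, such a term would typically be added to the classification loss with a weighting coefficient, encouraging flip-invariant attention and thereby the robustness the abstract refers to.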

Keywords:
Facial expression recognition; Multi-head attention; Attention consistency; Convolutional neural network; Cluster analysis; Feature extraction; Deep learning; Computer vision

Metrics

Cited by: 2
FWCI (Field-Weighted Citation Impact): 0.83
References: 37
Citation Normalized Percentile: 0.69

Topics

Emotion and Mood Recognition
Social Sciences →  Psychology →  Experimental and Cognitive Psychology
Face and Expression Recognition
Physical Sciences →  Computer Science →  Computer Vision and Pattern Recognition