JOURNAL ARTICLE

A Multimodal Sentiment Analysis Method Based on Interactive Attention

Abstract

Sentiment analysis, a crucial aspect of human–computer interaction and healthcare, faces challenges in multimodal data representation and fusion. Traditional methods often fail to adequately capture the complexities of integrating diverse modalities. To address this, we introduce the Cross Attention Fusion (CAF) algorithm, a novel approach for effective multimodal sentiment analysis. The CAF algorithm employs a dual attention mechanism, combining global and local attention, to enhance the representation and integration of features across modalities. Extensive experiments on the CH-SIMS dataset demonstrate that the CAF algorithm significantly outperforms existing unimodal and multimodal methods in accuracy. These results highlight the CAF algorithm's strength in capturing nuanced sentiment expressions, marking a notable advancement in the field of sentiment analysis.
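To illustrate the kind of cross-modal interaction the abstract describes, the following is a minimal sketch of generic scaled dot-product cross-attention, in which features from one modality (e.g., text) attend over features from another (e.g., audio). This is a standard construction, not the authors' CAF implementation; the function names and toy dimensions are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(query_feats, context_feats):
    """One modality attends over another (a generic sketch, not CAF itself).

    query_feats:   (n_q, d) features of the querying modality, e.g. text
    context_feats: (n_c, d) features of the attended modality, e.g. audio
    returns:       (n_q, d) context features re-weighted per query position
    """
    d = query_feats.shape[-1]
    scores = query_feats @ context_feats.T / np.sqrt(d)  # (n_q, n_c)
    weights = softmax(scores, axis=-1)                   # rows sum to 1
    return weights @ context_feats                       # fused representation

# Toy example: 4 text tokens attend over 6 audio frames, feature dim 8.
rng = np.random.default_rng(0)
text = rng.standard_normal((4, 8))
audio = rng.standard_normal((6, 8))
fused = cross_attention(text, audio)  # shape (4, 8)
```

In a multimodal fusion model, outputs like `fused` from each modality pair are typically concatenated or pooled before the final sentiment classifier; the "global vs. local" distinction in CAF would correspond to attending over full sequences versus windowed segments.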

Keywords:
Computer science; Sentiment analysis; Human–computer interaction; Artificial intelligence

Topics

Sentiment Analysis and Opinion Mining
Advanced Text Analysis Techniques
Video Analysis and Summarization