JOURNAL ARTICLE

A Simplified Query-Only Attention for Encoder-Based Transformer Models

Hong Gi Yeom, Kyung-min An

Year: 2024 | Journal: Applied Sciences | Vol: 14 (19) | Pages: 8646 | Publisher: Multidisciplinary Digital Publishing Institute

Abstract

Transformer models have revolutionized fields like Natural Language Processing (NLP) by enabling machines to accurately understand and generate human language. However, these models’ inherent complexity and limited interpretability pose barriers to their broader adoption. To address these challenges, we propose a simplified query-only attention mechanism specifically for encoder-based transformer models to reduce complexity and improve interpretability. Unlike conventional attention mechanisms, which rely on query (Q), key (K), and value (V) vectors, our method uses only the Q vector for attention calculation. This approach reduces computational complexity while maintaining the model’s ability to capture essential relationships, enhancing interpretability. We evaluated the proposed query-only attention on an EEG conformer model, a state-of-the-art architecture for EEG signal classification. We demonstrated that it performs comparably to the original QKV attention mechanism, while simplifying the model’s architecture. Our findings suggest that query-only attention offers a promising direction for the development of more efficient and interpretable transformer-based models, with potential applications across various domains beyond NLP.
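The abstract does not give the exact formulation of the query-only mechanism, but the idea can be sketched next to standard QKV attention. The following is a minimal NumPy illustration, assuming one plausible variant in which both the attention scores and the values are derived from the Q projection alone (the authors' actual formulation may differ):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def qkv_attention(X, Wq, Wk, Wv):
    # Standard scaled dot-product attention: separate Q, K, V projections.
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d = Q.shape[-1]
    A = softmax(Q @ K.T / np.sqrt(d))   # (seq, seq) attention weights
    return A @ V

def query_only_attention(X, Wq):
    # Hypothetical query-only variant: scores and values both come from Q,
    # so the Wk and Wv parameter matrices are eliminated entirely.
    Q = X @ Wq
    d = Q.shape[-1]
    A = softmax(Q @ Q.T / np.sqrt(d))
    return A @ Q
```

Under this sketch the parameter count of the attention projection drops from three weight matrices to one, which is the kind of architectural simplification the abstract claims; interpretability benefits would come from the attention map depending on a single learned subspace.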

Keywords:
Computer science, Transformer, Electrical engineering, Engineering, Voltage

Metrics

Cited By: 1
FWCI (Field-Weighted Citation Impact): 0.64
References: 41
Citation Normalized Percentile: 0.69

Topics

Neural Networks and Applications (Physical Sciences → Computer Science → Artificial Intelligence)
Neural Networks and Reservoir Computing (Physical Sciences → Computer Science → Artificial Intelligence)
Reinforcement Learning in Robotics (Physical Sciences → Computer Science → Artificial Intelligence)