JOURNAL ARTICLE

Dropout Multi-Head Attention for Single Image Super-Resolution

Abstract

Single Image Super-Resolution (SISR) has long been a foundational task in low-level vision. Recently, Transformer-based architectures have demonstrated outstanding performance on SISR tasks. However, attribution analysis indicates that Transformer-based networks tend to underutilize surrounding pixels compared to other algorithms. We propose a novel architecture, the dropout multi-head attention transformer (DMAT), to exploit more input pixels for super-resolution. DMAT enhances the attention mechanism by selectively obscuring key segments of windowed multi-head self-attention during training, which encourages a more uniform distribution of attention over the input pixels. Furthermore, to optimize multi-head attention learning and to integrate the diverse attention heads, we propose a head attention module (HAM) in DMAT that learns a weight for each attention head. Experimental results validate that our model outperforms prevailing state-of-the-art approaches across diverse test sets, particularly in contour structure and texture detail.
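The mechanism described in the abstract can be illustrated with a minimal sketch: standard windowed multi-head self-attention in which a random subset of key positions is masked ("dropped") during training, followed by a per-head scalar re-weighting in the spirit of the head attention module. This is a hedged NumPy illustration, not the paper's implementation; the function name `dropout_multihead_attention`, the identity Q/K/V projections, and the `head_weights` parameter are all simplifying assumptions for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout_multihead_attention(x, num_heads, key_drop_prob=0.1,
                                head_weights=None, training=True):
    """Illustrative sketch (not the paper's code): multi-head self-attention
    over one window of pixel tokens, where random key positions are masked
    during training so attention cannot over-focus on a few pixels, and each
    head's output is scaled by a learnable weight (HAM-style).

    x: (tokens, dim) array of flattened window pixels.
    Q, K, V use identity projections here for brevity.
    """
    tokens, dim = x.shape
    head_dim = dim // num_heads
    if head_weights is None:
        head_weights = np.ones(num_heads)  # learned parameters in practice

    # Split into heads: (heads, tokens, head_dim).
    q = k = v = x.reshape(tokens, num_heads, head_dim).transpose(1, 0, 2)
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(head_dim)  # (heads, T, T)

    if training and key_drop_prob > 0:
        # Obscure a random subset of key positions per head.
        drop = rng.random((num_heads, 1, tokens)) < key_drop_prob
        scores = np.where(drop, -1e9, scores)

    # Numerically stable softmax over the key axis.
    attn = np.exp(scores - scores.max(-1, keepdims=True))
    attn /= attn.sum(-1, keepdims=True)

    out = attn @ v                           # (heads, tokens, head_dim)
    out = out * head_weights[:, None, None]  # per-head weighting (HAM idea)
    return out.transpose(1, 0, 2).reshape(tokens, dim)
```

In a real model the dropout mask is resampled each step and disabled at inference, so the network learns not to rely on any small set of key pixels.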

Keywords:
Dropout (neural networks), Multi-head attention, Single image super-resolution, Computer vision, Machine learning, Computer science

Metrics

Cited By: 1
FWCI (Field Weighted Citation Impact): 0.53
Refs: 20
Citation Normalized Percentile: 0.49

Topics

Advanced Image Processing Techniques
Physical Sciences →  Computer Science →  Computer Vision and Pattern Recognition
Image Processing Techniques and Applications
Physical Sciences →  Engineering →  Media Technology
CCD and CMOS Imaging Sensors
Physical Sciences →  Engineering →  Electrical and Electronic Engineering

Related Documents

JOURNAL ARTICLE

SSIR: Spatial shuffle multi-head self-attention for Single Image Super-Resolution

Liangliang Zhao, Junyu Gao, Deng Dong-hu, Xuelong Li

Journal: Pattern Recognition, Year: 2023, Vol: 148, Pages: 110195
JOURNAL ARTICLE

Multi-attention augmented network for single image super-resolution

Rui Chen, Heng Zhang, Jixin Liu

Journal: Pattern Recognition, Year: 2021, Vol: 122, Pages: 108349
JOURNAL ARTICLE

Multi-attention fusion transformer for single-image super-resolution

Guanxing Li, Zhaotong Cui, Meng Li, Yu Han, Tianping Li

Journal: Scientific Reports, Year: 2024, Vol: 14 (1), Pages: 10222
JOURNAL ARTICLE

Multi-Grained Attention Networks for Single Image Super-Resolution

Huapeng Wu, Zhengxia Zou, Jie Gui, Wenjun Zeng, Jieping Ye, Jun Zhang, Hongyi Liu, Zhihui Wei

Journal: IEEE Transactions on Circuits and Systems for Video Technology, Year: 2020, Vol: 31 (2), Pages: 512-522