JOURNAL ARTICLE

Low-Light Image Enhancement Combined with Attention Map and U-Net Network

Abstract

Images captured in low-light environments suffer from degraded quality, so researchers have focused on methods for enhancing them. This paper proposes a new method for low-illumination image enhancement that combines traditional techniques with a convolutional neural network (CNN) to achieve brightness enhancement, color recovery, and denoising. The attention map we propose helps avoid overexposure in bright regions while enhancing dark regions. Extensive subjective and objective experiments show that the proposed method not only enhances low-illuminance images effectively but also requires less inference time.
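As a rough illustration of the attention-map idea described in the abstract, the sketch below uses a simple (hypothetical) formulation sometimes seen in attention-guided enhancement work: the attention weight is one minus the per-pixel max-channel luminance, so dark pixels receive strong enhancement while already-bright pixels are left nearly untouched. The `enhance` function, the gamma-curve brightening stand-in for the CNN output, and all parameter choices are assumptions for illustration, not the authors' actual network.

```python
import numpy as np

def attention_map(img):
    """Hypothetical attention map: 1 minus the max-channel luminance.

    img is an HxWx3 float array in [0, 1]. Dark pixels get weights near 1
    (strong enhancement); bright pixels get weights near 0 (protected
    from overexposure).
    """
    lum = img.max(axis=-1, keepdims=True)  # per-pixel max over RGB
    return 1.0 - lum

def enhance(img, gamma=0.5):
    """Blend a brightened image with the original, weighted by attention.

    A gamma curve stands in for the learned CNN enhancement; the attention
    map decides how much of the brightened result each pixel receives.
    """
    a = attention_map(img)
    brightened = np.power(img, gamma)
    return a * brightened + (1.0 - a) * img
```

With this weighting, a uniformly dark input (e.g. all pixels at 0.1) is lifted substantially, while a near-white input (pixels at 0.95) changes by less than one percent, matching the stated goal of enhancing dark regions without overexposing bright ones.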

Keywords:
Artificial intelligence; Brightness; Computer science; Computer vision; Illuminance; Image enhancement; Image (mathematics); Image quality; Noise reduction; Optics

Metrics

Cited By: 7
FWCI (Field Weighted Citation Impact): 0.21
Refs: 16
Citation Normalized Percentile: 0.51

Topics

Image Enhancement Techniques
Physical Sciences →  Computer Science →  Computer Vision and Pattern Recognition
Advanced Image Processing Techniques
Physical Sciences →  Computer Science →  Computer Vision and Pattern Recognition
Advanced Vision and Imaging
Physical Sciences →  Computer Science →  Computer Vision and Pattern Recognition

Related Documents

BOOK-CHAPTER

Low-Light Image Enhancement Combining U-Net and Self-attention Mechanism

Li Ma, Qian Wang

Book: Lecture Notes on Data Engineering and Communications Technologies   Year: 2022   Pages: 769-780
JOURNAL ARTICLE

Extreme Low-Light Image Enhancement for Surveillance Cameras Using Attention U-Net

Sophy Ai, Jang-Woo Kwon

Journal: Sensors   Year: 2020   Vol: 20 (2)   Pages: 495
JOURNAL ARTICLE

Multi-Scale Low-Light Image Enhancement Network Based on U-Net

XU Chaoyue, YU Ying, HE Penghao, LI Miao, MA Yuhui

Journal: DOAJ (Directory of Open Access Journals)   Year: 2022