JOURNAL ARTICLE

Image Restoration Based on Improved Generative Adversarial Networks

Yongsen Li, Jiana Meng, Yuhai Yu, Cunrui Wang, Zhongyuan Guan

Year: 2022 Published in: 2022 7th International Conference on Image, Vision and Computing (ICIVC) Pages: 799-804

Abstract

Image restoration aims to recover image information in broken or missing regions and plays an important role in computer vision. Deep neural network models have brought major breakthroughs in image restoration, but they remain ineffective at restoring fine details in high-resolution images. To address this problem, this paper proposes a generative adversarial network with embedded channel and spatial attention and, at the same time, gives the network a larger receptive field by increasing the dilation rate of its dilated convolutions, which induces the network to learn image details and broken edges better and thus improves the restoration of details. The proposed network is trained on public datasets, and the experimental results show that the method improves image restoration quality to a certain extent.
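One of the abstract's two modifications, enlarging the receptive field by raising the dilation rate of stacked dilated convolutions, can be illustrated with simple arithmetic. The sketch below is illustrative only; the kernel sizes and dilation rates are assumptions, not values taken from the paper:

```python
def receptive_field(kernel_sizes, dilations):
    """Receptive field of a stack of stride-1 convolutions.

    A layer with kernel size k and dilation d spans d*(k-1)+1
    consecutive inputs, so it widens the receptive field by d*(k-1).
    """
    rf = 1
    for k, d in zip(kernel_sizes, dilations):
        rf += d * (k - 1)
    return rf

# Four 3x3 layers: plain vs. exponentially increasing dilation
# (hypothetical rates, for illustration).
plain = receptive_field([3, 3, 3, 3], [1, 1, 1, 1])    # -> 9
dilated = receptive_field([3, 3, 3, 3], [1, 2, 4, 8])  # -> 31
```

With the same number of layers and parameters, the dilated stack sees a 31-pixel-wide context instead of 9, which is why increasing dilation is a cheap way to capture more surrounding structure around a damaged region.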

Keywords:
Image restoration, Computer science, Convolution (computer science), Image (mathematics), Artificial intelligence, Generative adversarial network, Generative grammar, Field (mathematics), Adversarial system, Perception, Computer vision, Artificial neural network, Image processing, Mathematics

Metrics

Cited By: 2
FWCI (Field Weighted Citation Impact): 0.14
Refs: 28
Citation Normalized Percentile: 0.42

Topics

Advanced Image Processing Techniques
Generative Adversarial Networks and Image Synthesis
Image and Signal Denoising Methods
(each under Physical Sciences → Computer Science → Computer Vision and Pattern Recognition)
