JOURNAL ARTICLE

Unsupervised Real-World Super Resolution with Cycle Generative Adversarial Network and Domain Discriminator

Abstract

This paper proposes an unsupervised single-image super-resolution (SR) model that uses a CycleGAN and a domain discriminator to solve SR with unknown degradation on an unpaired dataset. Previous approaches require a paired dataset for training, with assumed levels of image degradation. In real-world SR applications, however, training sets typically do not consist of low- and high-resolution image pairs; only low-resolution images with unknown degradation are available as inputs. To address this problem, we introduce a cycle-in-cycle GAN-based unsupervised learning model trained on an unpaired dataset. In addition, we combine several content-based losses, namely a pixel-wise loss, a VGG feature loss, and an SSIM loss, for stable learning and improved performance. We also propose a domain discriminator, composed of a noise discriminator, a texture discriminator, and a color discriminator, which guides generated images toward the target domain distribution rather than the source domain. We validate the effectiveness of our model in quantitative and qualitative experiments on the NTIRE 2020 real-world SR challenge dataset.
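The combined content loss described above (pixel-wise, VGG feature, and SSIM terms) can be sketched as a weighted sum. The following is a minimal NumPy illustration, not the authors' implementation: the VGG feature extractor is replaced by a simple image-gradient stand-in, SSIM is computed globally rather than over local windows, and the weights `w_pix`, `w_feat`, and `w_ssim` are hypothetical.

```python
import numpy as np

def l1_loss(a, b):
    # mean absolute error between two images or feature maps
    return np.abs(a - b).mean()

def ssim_global(a, b, c1=0.01**2, c2=0.03**2):
    # simplified global SSIM over the whole image
    # (the standard formulation averages over local sliding windows)
    mu_a, mu_b = a.mean(), b.mean()
    var_a, var_b = a.var(), b.var()
    cov = ((a - mu_a) * (b - mu_b)).mean()
    return ((2 * mu_a * mu_b + c1) * (2 * cov + c2)) / (
        (mu_a**2 + mu_b**2 + c1) * (var_a + var_b + c2))

def feature_map(img):
    # stand-in for VGG features: horizontal and vertical gradients
    # (assumption; the paper uses activations of a pretrained VGG network)
    return np.diff(img, axis=1), np.diff(img, axis=0)

def content_loss(sr, hr, w_pix=1.0, w_feat=1.0, w_ssim=0.1):
    # weighted sum of pixel-wise, feature, and SSIM losses;
    # the weights are hypothetical, not taken from the paper
    pix = l1_loss(sr, hr)
    fx_s, fy_s = feature_map(sr)
    fx_h, fy_h = feature_map(hr)
    feat = l1_loss(fx_s, fx_h) + l1_loss(fy_s, fy_h)
    ssim_term = 1.0 - ssim_global(sr, hr)  # SSIM=1 means identical images
    return w_pix * pix + w_feat * feat + w_ssim * ssim_term
```

The loss is zero for identical images and grows with pixel, gradient, and structural discrepancies; in the full model it would be combined with the adversarial losses of the cycle and domain discriminators.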

Keywords:
Super-resolution; Unsupervised learning; Generative adversarial network; Domain discriminator; Image degradation; Computer vision; Feature extraction; Image resolution

Metrics

Cited By: 55
FWCI (Field Weighted Citation Impact): 3.88
Refs: 36
Citation Normalized Percentile: 0.94 (in top 10%)
Topics

Advanced Image Processing Techniques (Physical Sciences → Computer Science → Computer Vision and Pattern Recognition)
Advanced Vision and Imaging (Physical Sciences → Computer Science → Computer Vision and Pattern Recognition)
Image Processing Techniques and Applications (Physical Sciences → Engineering → Media Technology)