JOURNAL ARTICLE

Re-GAN: Data-Efficient GANs Training via Architectural Reconfiguration

Abstract

Training Generative Adversarial Networks (GANs) on high-fidelity images usually requires a vast number of training images. Recent research on GAN tickets reveals that dense GAN models contain sparse sub-networks, or "lottery tickets," that yield better results under limited data when trained separately. However, finding GAN tickets requires an expensive train-prune-retrain process. In this paper, we propose Re-GAN, a data-efficient GAN training method that dynamically reconfigures the GAN architecture to explore different sub-network structures during training. Our method repeatedly prunes unimportant connections to regularize the network and regrows them to reduce the risk of pruning important connections prematurely. Re-GAN stabilizes GAN models with less data and offers an alternative to existing GAN-tickets and progressive-growing methods. We demonstrate that Re-GAN is a generic training methodology that achieves stability on datasets of varying sizes, domains, and resolutions (CIFAR-10, Tiny-ImageNet, and multiple few-shot generation datasets) as well as on different GAN architectures (SNGAN, ProGAN, StyleGAN2, and AutoGAN). Re-GAN also improves performance when combined with recent augmentation approaches. Moreover, by removing unimportant connections during training, Re-GAN requires fewer floating-point operations (FLOPs) and less training time while generating samples of comparable or even higher quality. Compared to the state-of-the-art StyleGAN2, our method performs better without requiring any additional fine-tuning step. Code is available at: https://github.com/IntellicentAI-Lab/Re-GAN
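The prune-and-regrow cycle described above can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: the function name, the magnitude-based pruning criterion, random regrowth, and the ratio parameters are all illustrative assumptions for a single weight matrix with a binary connectivity mask.

```python
import numpy as np

def prune_and_regrow(weights, mask, prune_ratio=0.3, regrow_ratio=0.1, rng=None):
    """One illustrative reconfiguration step: prune the smallest-magnitude
    active weights, then randomly reactivate some pruned positions so they
    can be retrained in later iterations."""
    rng = np.random.default_rng(rng)
    mask = mask.copy()

    # Prune: zero the mask for the smallest-magnitude weights still active.
    active = np.flatnonzero(mask)
    n_prune = int(len(active) * prune_ratio)
    if n_prune > 0:
        mags = np.abs(weights.flat[active])
        drop = active[np.argsort(mags)[:n_prune]]
        mask.flat[drop] = 0

    # Regrow: reactivate a random subset of pruned connections, reducing
    # the risk that an important weight was pruned prematurely.
    inactive = np.flatnonzero(mask == 0)
    n_regrow = int(mask.size * regrow_ratio)
    if n_regrow > 0 and len(inactive) > 0:
        grow = rng.choice(inactive, size=min(n_regrow, len(inactive)), replace=False)
        mask.flat[grow] = 1

    return weights * mask, mask
```

In a training loop, a step like this would typically run every few epochs, with the optimizer updating only the unmasked weights in between; the regrown connections start from their stored values (or zero) and are free to become important again.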

Keywords:
Generative adversarial networks; pruning; network architecture; data-efficient training; FLOPs; machine learning

Metrics

Cited by: 28
FWCI (Field-Weighted Citation Impact): 5.10
References: 91
Citation Normalized Percentile: 0.95 (in top 10%)

Topics

Generative Adversarial Networks and Image Synthesis
Advanced Image Processing Techniques
Digital Media Forensic Detection
(all classified under Physical Sciences → Computer Science → Computer Vision and Pattern Recognition)

Related Documents

JOURNAL ARTICLE

Re-GAN: Data-Efficient GANs Training via Architectural Reconfiguration

Divya Saxena, Jiannong Cao, Jiahao Xu, Tarun Kulshrestha

Journal: IEEE Transactions on Pattern Analysis and Machine Intelligence, Year: 2025, Vol: 47 (11), Pages: 9580-9596
JOURNAL ARTICLE

DM-GAN: CNN hybrid vits for training GANs under limited data

Longquan Yan, Ruixiang Yan, Bosong Chai, Guohua Geng, Pengbo Zhou, Jianxi Gao

Journal: Pattern Recognition, Year: 2024, Vol: 156, Article: 110810
JOURNAL ARTICLE

Consistency-GAN: Training GANs with Consistency Model

Yunpeng Wang, Meng Pang, Shengbo Chen, Hong Rao

Journal: Proceedings of the AAAI Conference on Artificial Intelligence, Year: 2024, Vol: 38 (14), Pages: 15743-15751
BOOK-CHAPTER

Fictitious GAN: Training GANs with Historical Models

Hao Ge, Xia Yin, Xu Chen, Randall Berry, Ying Wu

Lecture Notes in Computer Science, Year: 2018, Pages: 122-137