JOURNAL ARTICLE

Synthetic derivative T2-weighted abdominal images from T1-weighted images using a generative adversarial network (GAN)

Shu Zhang, Phillip K. Martin, Nakul Gupta, María I. Altbach, Ali Bilgin, Diego Aponte

Year: 2024 Journal: Proceedings of the International Society for Magnetic Resonance in Medicine, Scientific Meeting and Exhibition

Abstract

Motivation: Both fast 2D T2-weighted abdominal imaging and 3D T2 MIP techniques have limitations. There remains a need for fast 3D high-resolution T2-weighted abdominal imaging. Goal(s): To develop a conditional GAN model that synthesizes T2-weighted images from 3D high-resolution T1-weighted abdominal images while preserving the spatial resolution of the source images. Approach: Abdominal images acquired from 39 volunteers were included in the study. A conditional GAN model was trained to generate T2-weighted images from T1-weighted images slice by slice. Results: Overall, the generated T2-weighted images were similar to the real T2-weighted images, though some contrast differences were seen in the bowel and kidneys. Impact: This proof-of-principle study shows that a GAN model can be used to generate T2-weighted images from T1-weighted images, with the potential to render high-quality volumetric 3D high-resolution abdominal T2-weighted images superior to current 3D MIP methods.

Keywords:
Generative adversarial network, Artificial intelligence, Pattern recognition, Computer science, Algorithm, Image processing

Metrics

Cited By: 0
FWCI (Field Weighted Citation Impact): 0.00
Refs: 0
Citation Normalized Percentile: 0.21

Topics

Advanced X-ray and CT Imaging
Physical Sciences →  Engineering →  Biomedical Engineering
Image and Signal Denoising Methods
Physical Sciences →  Computer Science →  Computer Vision and Pattern Recognition