Siyuan Hao, Yufeng Xia, Yuanxin Ye
In recent years, generative adversarial networks (GANs) have made great progress in hyperspectral image classification (HIC), largely alleviating the problem of insufficient training samples. At present, GANs for HIC are all based on convolutional neural networks (CNNs). However, CNNs cannot extract sequence information well and have difficulty modeling long-range dependencies, whereas hyperspectral data are rich in spectral sequence information, and the Transformer has proven adept at processing sequences. Therefore, to exploit spectral information and alleviate the shortage of training samples for hyperspectral images (HSIs), we propose a new framework, the Transformer with Residual Upscale GAN (TRUG). TRUG comprises a generator G and a discriminator D. In G, we propose a residual upscale (RU) module to increase the resolution of generated features while also extracting texture features and capturing contextual relationships. We also visualize the generated fake images for more intuitive analysis. In D, we adopt Transformer blocks with progressively decreasing scale and use a grid self-attention mechanism in the first layer to better extract features for classification. In addition, GANs are prone to unstable training; to address this problem, we improve the normalization algorithm and add relative position encoding. We apply a pure Transformer-based GAN to HIC datasets. Experimental results show that the proposed TRUG model outperforms other models on three datasets.
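The residual upscale (RU) idea in the generator can be illustrated with a minimal sketch: a pixel-shuffle upsampling path (a common sub-pixel technique) combined with a nearest-neighbor residual skip, so the upscaled features retain the input content while a learned branch adds detail. This is an assumption about the design for illustration only, not the paper's exact module; the random 1x1 projection stands in for a learned convolution.

```python
import numpy as np

def pixel_shuffle(x, r):
    """Rearrange (C*r*r, H, W) -> (C, H*r, W*r), as in sub-pixel upsampling."""
    c2, h, w = x.shape
    c = c2 // (r * r)
    x = x.reshape(c, r, r, h, w)
    x = x.transpose(0, 3, 1, 4, 2)  # (c, h, r, w, r)
    return x.reshape(c, h * r, w * r)

def nearest_upsample(x, r):
    """Nearest-neighbor upsampling: repeat each spatial position r times per axis."""
    return x.repeat(r, axis=1).repeat(r, axis=2)

def residual_upscale(x, r=2, rng=None):
    """Hypothetical RU block: learned sub-pixel branch + nearest-neighbor skip."""
    rng = rng or np.random.default_rng(0)
    c, h, w = x.shape
    # Stand-in for a learned 1x1 conv that expands channels by r*r.
    weight = rng.standard_normal((c * r * r, c)) * 0.02
    expanded = np.einsum('oc,chw->ohw', weight, x)
    return pixel_shuffle(expanded, r) + nearest_upsample(x, r)
```

With a (4, 8, 8) feature map and r=2, the output shape is (4, 16, 16): spatial resolution doubles while the channel count is preserved, which is the behavior a GAN generator needs when growing features toward image resolution.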