S. P. Hu, Yikun Hu, Junyan Lin, Feng Gao, Junyu Dong
Removing noise from hyperspectral images (HSIs) is widely regarded as one of the most important preprocessing tasks in remote sensing image interpretation. In this paper, we extend the Transformer backbone to HSI denoising and propose a Multi-scale Transformer Denoising Network (MTDNet). Specifically, we design a multi-head global attention module that alleviates the computational burden of self-attention. Furthermore, we propose a multi-scale feed-forward network in which three branches of multi-scale features are extracted through dilated convolutions, enriching the non-linear feature transformation in the Transformer block. Both objective and subjective experiments on the ICVL dataset demonstrate the superiority of the proposed MTDNet over four closely related methods.
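The multi-scale feed-forward idea above can be sketched in miniature: three parallel branches share the same kernel size but use different dilation rates, so each branch sees a different receptive field before the branches are fused. This is a minimal 1D numpy illustration, not the paper's implementation; the kernel size (3), dilation rates (1, 2, 3), and the sum-fusion rule are all assumptions for demonstration.

```python
import numpy as np

def dilated_conv1d(x, kernel, dilation):
    """'Same'-padded 1D dilated convolution (cross-correlation form).

    A kernel of size k with dilation d covers a receptive field of
    d * (k - 1) + 1 input positions, which is how the branches below
    capture multi-scale context with the same number of weights.
    """
    k = len(kernel)
    pad = dilation * (k - 1) // 2
    xp = np.pad(np.asarray(x, dtype=float), pad)
    return np.array([
        sum(kernel[j] * xp[i + j * dilation] for j in range(k))
        for i in range(len(x))
    ])

def multi_scale_ffn(x, kernels, dilations=(1, 2, 3)):
    """Toy multi-scale feed-forward step: three dilated branches, summed.

    The fusion by summation is an assumption; concatenation followed by
    a projection would be an equally plausible design.
    """
    branches = [dilated_conv1d(x, k, d) for k, d in zip(kernels, dilations)]
    return np.sum(branches, axis=0)

# With identity kernels, each branch passes the signal through unchanged,
# so the summed output is simply 3 * x — a quick sanity check.
x = np.arange(8.0)
y = multi_scale_ffn(x, kernels=[[0, 1, 0]] * 3)
```

In the actual MTDNet block these would be 2D (spatial) convolutions over feature maps inside the Transformer's feed-forward path; the 1D version only illustrates how dilation widens the receptive field per branch.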