Jie Chen, Peien He, Jingru Zhu, Ya Guo, Geng Sun, Min Deng, Haifeng Li
Deep learning-based semantic segmentation has been widely applied to building extraction. However, due to the domain gap, extracting buildings from high-resolution remote sensing imagery is difficult when a model trained on a source dataset is directly tested on a target dataset. Inspired by the way humans retrieve memories to handle related tasks across different domains, memory mechanisms have been developed to effectively assist cross-domain feature extraction. Whether a memory mechanism achieves satisfactory results, however, depends on the premise that the stored memory is relevant to the task. Domain-invariant memory is therefore crucial for cross-domain building extraction. To this end, a memory-contrastive unsupervised domain adaptation method is proposed on the basis of a novel memory mechanism. Specifically, to encourage the model to memorize domain-invariant features, we first apply a normalization-based image style transfer strategy and a discriminator-based adversarial method at the image level and feature level, respectively. We then employ a memory-contrastive module to obtain domain-invariant features. In particular, a teacher–student network transfers knowledge via knowledge distillation to enhance the performance of the memory-contrastive module. To narrow the distance between the two domains, a memory bank is designed to store and update category features obtained from the source domain, and the similarity between category features in the target domain and those in the memory bank is then computed. Results of cross-domain experiments show that the proposed method achieves optimal building extraction. (GitHub: https://github.com/RS-CSU/MDANet).
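The memory bank described above can be illustrated with a minimal sketch. This is not the authors' implementation (see the linked repository for that); it assumes one prototype feature per category, an exponential-moving-average update from source-domain features (momentum value assumed), and cosine similarity between a target feature and each stored prototype:

```python
import numpy as np

class MemoryBank:
    """Hypothetical sketch of a per-category memory bank.

    Stores one prototype vector per class, updated from source-domain
    features with an exponential moving average (momentum is assumed).
    """

    def __init__(self, num_classes, feat_dim, momentum=0.9):
        self.momentum = momentum
        self.bank = np.zeros((num_classes, feat_dim))

    def update(self, class_id, source_feat):
        # EMA update of the stored prototype for this category
        self.bank[class_id] = (self.momentum * self.bank[class_id]
                               + (1 - self.momentum) * source_feat)

    def similarity(self, target_feat):
        # Cosine similarity between a target-domain feature vector
        # and every stored category prototype
        eps = 1e-8
        bank_norm = self.bank / (
            np.linalg.norm(self.bank, axis=1, keepdims=True) + eps)
        feat_norm = target_feat / (np.linalg.norm(target_feat) + eps)
        return bank_norm @ feat_norm
```

In a contrastive setup, the similarity scores would pull a target feature toward the prototype of its (pseudo-)label and push it away from the other prototypes; that loss is omitted here.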