With the rapid development of the Internet, and especially the extensive application of deep learning, change detection has been successfully applied in many fields, but the pursuit of richer feature information has driven up memory and computing requirements. To address this problem, this paper proposes a Lightweight Siamese attention Residual Network (LSRNet) that reduces memory and computational demands. LSRNet embeds Siamese Fast Small Attention (SFSAttention) to filter out feature information with low relevance, then fuses features along the channel dimension to preserve feature richness while reducing the number of model parameters. A residual network module is introduced to extract the complete feature information and produce the final change map. While maintaining accuracy, LSRNet reduces the parameter count on the LEVIR-CD and CCD datasets by 3.7 M and 4 M, and the number of FLOPs by 2.5 G and 2.9 G, respectively.
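The channel-dimension feature fusion described above can be illustrated with a minimal numpy sketch. This is a hypothetical toy example, not the paper's implementation: the shapes, variable names, and the 1x1 projection are assumptions chosen only to show how concatenating the two Siamese branch outputs along the channel axis, followed by a cheap per-pixel channel projection, keeps feature richness while adding few parameters.

```python
import numpy as np

# Hypothetical sketch of channel-dimension feature fusion in a Siamese
# change-detection network (illustrative shapes, not the paper's actual model).
rng = np.random.default_rng(0)

C, H, W = 8, 4, 4
feat_t1 = rng.standard_normal((C, H, W))  # features from the image at time 1
feat_t2 = rng.standard_normal((C, H, W))  # features from the image at time 2

# Fuse the two branch outputs by concatenating along the channel axis,
# preserving all feature information from both time steps.
fused = np.concatenate([feat_t1, feat_t2], axis=0)  # shape (2C, H, W)

# A 1x1 convolution (here written as a per-pixel linear map over channels)
# projects the fused tensor back to C channels; it needs only 2C*C weights,
# far fewer than a full spatial kernel would require.
w = rng.standard_normal((C, 2 * C)) * 0.1
out = np.einsum('oc,chw->ohw', w, fused)  # shape (C, H, W)

print(fused.shape, out.shape)
```

The design point this sketch captures is that fusion by concatenation is parameter-free, and the subsequent channel reduction is the only learned step, which is one way a model can stay lightweight.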