JOURNAL ARTICLE

Self-Supervised Feature Contrastive Learning for Small Weak Object Detection in Remote Sensing

Zheng Li, Xueyan Hu, Jin Qian, Tianqi Zhao, Dongdong Xu, Yongcheng Wang

Year: 2025 · Journal: Remote Sensing · Vol: 17 (8) · Article Number: 1438 · Publisher: Multidisciplinary Digital Publishing Institute

Abstract

Despite advances in remote sensing object detection, accurately identifying small, weak objects remains challenging. Their limited pixel representation often fails to capture distinctive features, making them susceptible to environmental interference. Current detectors frequently miss these subtle feature variations. To address these challenges, we propose FCDet, a feature contrast-based detector for small, weak objects. Our approach introduces: (1) a spatial-guided feature upsampler (SGFU) that aligns features by adaptive sampling based on spatial distribution, thus achieving fine-grained alignment during feature aggregation; (2) a feature contrast head (FCH) that projects GT and RoI features into an embedding space for discriminative learning; and (3) an instance-controlled label assignment (ICLA) strategy that optimizes sample selection for feature contrastive learning. We conduct comprehensive experiments on challenging datasets, with the proposed method achieving 73.89% mAP on DIOR, 95.04% mAP on NWPU VHR-10, and 26.4% AP on AI-TOD, demonstrating its effectiveness and superior performance.
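The abstract describes the feature contrast head (FCH) only at a high level: ground-truth (GT) and RoI features are projected into a shared embedding space for discriminative learning. As an illustration of that general idea only (not the authors' implementation; all function names, shapes, and the temperature value are assumptions), a minimal InfoNCE-style contrast between RoI and GT embeddings could be sketched as:

```python
import numpy as np

def l2_normalize(x, axis=-1, eps=1e-8):
    """Project feature vectors onto the unit hypersphere."""
    return x / (np.linalg.norm(x, axis=axis, keepdims=True) + eps)

def feature_contrast_loss(roi_feats, roi_labels, gt_feats, gt_labels,
                          temperature=0.1):
    """Illustrative InfoNCE-style loss: each RoI embedding is pulled toward
    GT embeddings of the same class and pushed away from the rest.

    roi_feats: (N_roi, D) RoI features; gt_feats: (N_gt, D) GT features.
    This is a generic sketch, not the FCH described in the paper.
    """
    z_roi = l2_normalize(roi_feats)
    z_gt = l2_normalize(gt_feats)
    sims = z_roi @ z_gt.T / temperature                      # (N_roi, N_gt)
    # Numerically stable softmax over GT candidates per RoI.
    exp = np.exp(sims - sims.max(axis=1, keepdims=True))
    pos_mask = (roi_labels[:, None] == gt_labels[None, :]).astype(float)
    prob_pos = (exp * pos_mask).sum(axis=1) / exp.sum(axis=1)
    return float(-np.log(prob_pos + 1e-8).mean())

# Toy usage with random features (hypothetical shapes).
rng = np.random.default_rng(0)
roi = rng.standard_normal((4, 8))
gt = rng.standard_normal((2, 8))
loss = feature_contrast_loss(roi, np.array([0, 0, 1, 1]),
                             gt, np.array([0, 1]))
```

In a real detector this loss would be added to the usual classification and regression objectives, with sample selection handled by a strategy such as the paper's ICLA.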

Keywords:
Computer science, Remote sensing, Feature extraction, Artificial intelligence, Pattern recognition, Geology

Metrics

Cited By: 6
FWCI (Field-Weighted Citation Impact): 21.10
References: 60
Citation Normalized Percentile: 0.98 (listed as in top 1% and top 10%)

Topics

Remote-Sensing Image Classification (Physical Sciences → Engineering → Media Technology)
Remote Sensing and Land Use (Physical Sciences → Earth and Planetary Sciences → Atmospheric Science)
Face and Expression Recognition (Physical Sciences → Computer Science → Computer Vision and Pattern Recognition)