JOURNAL ARTICLE

Geospatial Object Detection In Remote Sensing Images Based On Multi-Scale Convolutional Neural Networks

Abstract

Automatic object detection is a basic but challenging problem in remote sensing image (RSI) interpretation. Recently, context-based top-down detection architectures have been proposed that generate high-quality fused features at all scales and significantly improve the accuracy of traditional detection frameworks. However, in the top-down architecture, small objects are easily lost in deep layers, and context cues are weakened at the same time. To tackle these problems, this paper proposes a novel Multi-scale Detection Network (MSDN). The proposed method maintains the resolution of deep features, which strengthens the feature representation of multi-scale objects. Meanwhile, a dilated bottleneck structure is introduced, which effectively enlarges the receptive field and improves the regression ability for multi-scale objects. The proposed method is evaluated on the NWPU VHR-10 benchmark and achieves a notable improvement over comparable state-of-the-art detection frameworks.
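To illustrate the dilated bottleneck idea mentioned in the abstract, the sketch below shows a generic PyTorch block of this kind. It is not the authors' released code: the channel widths, dilation rate, and layer order are illustrative assumptions. The point it demonstrates is that a dilated 3x3 convolution enlarges the receptive field while the feature map keeps its spatial resolution.

```python
# Minimal illustrative sketch, not MSDN's actual configuration.
import torch
import torch.nn as nn


class DilatedBottleneck(nn.Module):
    """ResNet-style bottleneck whose 3x3 convolution is dilated instead of strided,
    so spatial resolution is preserved while the receptive field grows."""

    def __init__(self, in_channels: int, mid_channels: int, dilation: int = 2):
        super().__init__()
        self.block = nn.Sequential(
            # 1x1 reduction
            nn.Conv2d(in_channels, mid_channels, kernel_size=1, bias=False),
            nn.BatchNorm2d(mid_channels),
            nn.ReLU(inplace=True),
            # dilated 3x3: padding = dilation keeps the spatial size unchanged
            nn.Conv2d(mid_channels, mid_channels, kernel_size=3,
                      padding=dilation, dilation=dilation, bias=False),
            nn.BatchNorm2d(mid_channels),
            nn.ReLU(inplace=True),
            # 1x1 expansion back to the input width
            nn.Conv2d(mid_channels, in_channels, kernel_size=1, bias=False),
            nn.BatchNorm2d(in_channels),
        )
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # identity shortcut: output resolution equals input resolution
        return self.relu(self.block(x) + x)


if __name__ == "__main__":
    feat = torch.randn(1, 256, 64, 64)           # hypothetical deep feature map
    out = DilatedBottleneck(256, 64, dilation=2)(feat)
    print(out.shape)                             # torch.Size([1, 256, 64, 64])
```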

Keywords:
Computer science, Convolutional neural network, Bottleneck, Object detection, Artificial intelligence, Context, Scale, Geospatial analysis, Feature extraction, Pattern recognition, Computer vision, Remote sensing, Geography, Embedded system

Metrics

Cited By: 20
FWCI (Field Weighted Citation Impact): 1.28
References: 20
Citation Normalized Percentile: 0.84

Topics

Advanced Neural Network Applications (Physical Sciences → Computer Science → Computer Vision and Pattern Recognition)
Remote-Sensing Image Classification (Physical Sciences → Engineering → Media Technology)
Advanced Image and Video Retrieval Techniques (Physical Sciences → Computer Science → Computer Vision and Pattern Recognition)
