BOOK-CHAPTER

Applying a Deep Learning Approach for Building Extraction From High-Resolution Remote Sensing Imagery

Dolonchapa Prabhakar, Pradeep Kumar Garg

Year: 2023 | Advances in Geospatial Technologies book series | Pages: 157-179 | Publisher: IGI Global

Abstract

As data science is applied to mapping buildings, great attention has been given to the potential of deep learning and new data sources. Because convolutional neural networks (CNNs) dominate image classification tasks, automating the building extraction process is becoming increasingly common. Increased access to unstructured data (such as imagery and text), together with advances in deep learning and computer vision algorithms, has improved the prospects for automating the extraction of building attributes from satellite images in a cost-effective and large-scale manner. Applying intelligent software-based solutions to satellite imagery can expedite the acquisition of features such as building footprints, a manual process that is otherwise time-consuming and expensive. Buildings can be recovered from RGB images and identified with high accuracy. This chapter offers suggestions to accelerate the development of DL-based building extraction techniques using remotely sensed images.
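The CNN-based extraction pipeline the abstract describes can be illustrated with a toy fully convolutional pass: an RGB image is collapsed to one channel, convolved with a filter, and thresholded into a binary footprint mask. This is a minimal sketch only; the hand-written Laplacian-style kernel and the sigmoid threshold are illustrative stand-ins for the learned filters of a trained CNN, and the function names are hypothetical, not from the chapter.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2-D convolution of a single-channel image with a small kernel."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def extract_building_mask(rgb, threshold=0.5):
    """Toy pipeline: RGB array (H, W, 3) -> binary building-footprint mask.

    In a real DL workflow the kernel below would be one of many filters
    learned from labelled footprint data; here it is a fixed edge detector
    used purely to demonstrate the convolve-then-threshold structure.
    """
    gray = rgb.mean(axis=2)                       # collapse RGB channels
    kernel = np.array([[-1, -1, -1],
                       [-1,  8, -1],
                       [-1, -1, -1]], dtype=float)  # Laplacian-like edge filter
    response = conv2d(gray, kernel)
    prob = 1.0 / (1.0 + np.exp(-response))        # sigmoid "building" score
    return (prob > threshold).astype(np.uint8)    # binarised footprint mask

# Usage: a bright 4x4 "building" on a dark 10x10 background.
img = np.zeros((10, 10, 3))
img[3:7, 3:7, :] = 1.0
mask = extract_building_mask(img)                 # 8x8 mask, 1s along edges
```

A trained network would replace the single fixed kernel with stacked learned convolutions and an upsampling head so the output mask matches the input resolution, but the forward structure (convolve, activate, threshold) is the same.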

Keywords:
Computer science, Convolutional neural network, Deep learning, Process (computing), Artificial intelligence, RGB color model, Feature extraction, Satellite imagery, Software, Satellite, High resolution, Remote sensing, Computer vision, Machine learning, Engineering, Geography

Metrics

Cited By: 1
FWCI (Field Weighted Citation Impact): 1.05
References: 45
Citation Normalized Percentile: 0.68

Topics

Remote-Sensing Image Classification
Physical Sciences →  Engineering →  Media Technology
Automated Road and Building Extraction
Physical Sciences →  Engineering →  Ocean Engineering
Remote Sensing and LiDAR Applications
Physical Sciences →  Environmental Science →  Environmental Engineering