BOOK-CHAPTER

New Neural Buildings Stereo Matching Method Applied to Very High Resolution Ikonos Images

Ehlem Zigh

Year: 2015 | Series: Advances in Computational Intelligence and Robotics | Pages: 322-350 | Publisher: IGI Global

Abstract

The author introduces a new neural stereo matching method for very high resolution IKONOS images, without the parameters of the image acquisition system or other technological resources such as a digital elevation model, Lidar, or laser data. These images contain dense urban scenes including various kinds of roads, cars, vegetation, and buildings. The author is interested in buildings: they have different shapes, positions, and intensity levels or colours, so they produce many “false matches.” To solve this issue, the author first extracts building regions; she then proposes a neural stereo matching method. A neural field is chosen for its good management of the imprecision and uncertainty inherent in real problems in general and in this one in particular. To show the effectiveness of the proposed method, the chapter first details the problems encountered; second, it explains the stereo matching process, its different kinds, and the chosen approach; third, it presents the results obtained using panchromatic and colour images.
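The abstract contrasts the proposed neural approach with the false matches produced by conventional correspondence. As background only (this is not the chapter's neural method, and all function and parameter names here are illustrative), a minimal sketch of classical window-based stereo matching with a sum-of-absolute-differences (SAD) cost shows the kind of per-pixel search in which repetitive building facades make several disparities score almost equally well:

```python
import numpy as np

def block_match_disparity(left, right, window=5, max_disp=16):
    """Naive window-based stereo matching on a rectified pair.

    For each pixel of the left image, the best horizontal shift
    (disparity) into the right image is the one minimising the SAD
    cost over a small window. Repetitive building textures give
    several shifts nearly equal costs, producing the "false
    matches" the chapter discusses.
    """
    h, w = left.shape
    half = window // 2
    disparity = np.zeros((h, w), dtype=np.int32)
    for y in range(half, h - half):
        for x in range(half, w - half):
            patch = left[y - half:y + half + 1,
                         x - half:x + half + 1].astype(np.float64)
            best_cost, best_d = np.inf, 0
            # Only shifts that keep the window inside the image.
            for d in range(min(max_disp, x - half) + 1):
                cand = right[y - half:y + half + 1,
                             x - d - half:x - d + half + 1]
                cost = np.abs(patch - cand).sum()
                if cost < best_cost:
                    best_cost, best_d = cost, d
            disparity[y, x] = best_d
    return disparity

# Synthetic check: the right image is the left shifted by 3 pixels,
# so interior pixels should recover a disparity of 3.
rng = np.random.default_rng(0)
left = rng.integers(0, 255, size=(20, 30)).astype(np.uint8)
d_true = 3
right = np.zeros_like(left)
right[:, :-d_true] = left[:, d_true:]
disp = block_match_disparity(left, right, window=5, max_disp=8)
```

On such a noise-free synthetic pair the minimum cost is exact; on real urban IKONOS scenes the cost surface is flat along repeated facade patterns, which motivates constraining the search to extracted building regions as the chapter does.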

Keywords:
Panchromatic film, Artificial intelligence, Computer vision, Matching (statistics), Computer science, Lidar, Process (computing), Field (mathematics), High resolution, Image (mathematics), Pattern recognition (psychology), Remote sensing, Geography, Mathematics

Metrics

Cited By: 0
FWCI (Field Weighted Citation Impact): 0.00
Refs: 38
Citation Normalized Percentile: 0.25

Topics

Advanced Vision and Imaging (Physical Sciences → Computer Science → Computer Vision and Pattern Recognition)
Advanced Image and Video Retrieval Techniques (Physical Sciences → Computer Science → Computer Vision and Pattern Recognition)
Robotics and Sensor-Based Localization (Physical Sciences → Engineering → Aerospace Engineering)