Abstract

The general principle of multi-scale fusion is based on modelling the mixed pixels of the low-resolution image and the confused (non-separable) classes of the high-resolution image. The model takes into account not only the simple classes distinguishable at both resolutions, but also unions of classes that cannot be distinguished across the resolutions. In this paper we present a multi-scale image fusion process based on the Dempster-Shafer theory of evidence (DST). The aim of the fusion process is to improve the land cover map by exploiting the rich spectral information of low-spatial-resolution images and the rich spatial information of high-spatial-resolution images. Multi-scale fusion with the unsupervised and supervised models generates new spectral classes that are indistinguishable at either the high or the low spatial resolution, yielding a land cover map rich in both spectral and spatial information. When prior knowledge about the desired fusion classes is available, the supervised model is recommended; otherwise, it is preferable to apply the unsupervised model over all possible fusion classes generated by the mathematical formalism of DST.
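The core mechanism behind DST-based fusion is Dempster's rule of combination, which merges the mass assignments of two sources by intersecting their focal elements and renormalizing away the conflicting mass. A minimal sketch follows; the class names and mass values are hypothetical, chosen only to mimic the situation the abstract describes, where the low-resolution (spectral) source assigns mass to a union of classes it cannot separate and the high-resolution (spatial) source refines it.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions (dicts mapping frozenset -> mass)
    with Dempster's rule: intersect focal elements, accumulate the
    product of masses, and renormalize by 1 - K, where K is the
    total mass assigned to empty (conflicting) intersections."""
    combined = {}
    conflict = 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb
    if conflict >= 1.0:
        raise ValueError("Total conflict: sources are incompatible")
    return {s: w / (1.0 - conflict) for s, w in combined.items()}

# Hypothetical frame of discernment: {water, vegetation, urban}.
# The low-resolution source cannot separate vegetation from urban,
# so it assigns mass to their union (a compound class).
m_low = {
    frozenset({"water"}): 0.5,
    frozenset({"vegetation", "urban"}): 0.5,
}
# The high-resolution source provides finer spatial evidence.
m_high = {
    frozenset({"water"}): 0.2,
    frozenset({"vegetation"}): 0.6,
    frozenset({"vegetation", "urban"}): 0.2,
}

fused = dempster_combine(m_low, m_high)
# fused masses: water 0.2, vegetation 0.6, {vegetation, urban} 0.2
```

Note how the combined evidence concentrates mass on the singleton "vegetation", resolving part of the ambiguity that the low-resolution source alone could not.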

Keywords:
Dempster–Shafer theory; image fusion; sensor fusion; image resolution; scale; pixel; pattern recognition; land cover; land use; spatial analysis; remote sensing; computer vision; artificial intelligence; cartography

Metrics

- Cited by: 5
- FWCI (Field-Weighted Citation Impact): 0.00
- References: 6
- Citation Normalized Percentile: 0.12

Topics

Advanced Image Fusion Techniques
Physical Sciences → Engineering → Media Technology
Geochemistry and Geologic Mapping
Physical Sciences → Computer Science → Artificial Intelligence
Remote-Sensing Image Classification
Physical Sciences → Engineering → Media Technology
