JOURNAL ARTICLE

Attention-based multi-scale network for hyperspectral image classification

Abstract

Hyperspectral image (HSI) classification methods based on convolutional neural networks (CNNs) are becoming increasingly popular. Many proposed methods extract spatial and spectral features simultaneously, and the interaction between the two types of features leads to unsatisfactory classification results. Moreover, most existing CNN-based methods consider only a single scale, which can cause important information to be neglected. To address these two issues, we propose an Attention-Based Multi-Scale Network (AMSN) for HSI classification. First, the proposed network is based on a 3D-CNN; through a channel branch and a spatial branch with different convolution kernels, AMSN captures more distinctive spectral and spatial features, respectively. Second, local and global features are extracted by a dense network and then concatenated to make full use of multi-scale features. Third, an attention block is applied after each branch to obtain the most distinctive features. Experimental results on three HSI datasets demonstrate that the proposed framework achieves better classification performance than several state-of-the-art methods.
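The per-branch attention described above can be illustrated with a minimal, framework-free sketch of squeeze-and-excitation-style channel attention: each channel of a feature map is re-weighted by a gate computed from its global average. This is a toy illustration under assumed simplifications (a fixed sigmoid gate in place of learned fully-connected layers), not the authors' exact implementation.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def channel_attention(feature_map):
    """Re-weight each channel of a [C][H][W] feature map by a gate
    derived from its global average (squeeze-and-excitation style).

    A real attention block would learn the gating function; here we
    use a plain sigmoid of the channel mean as a stand-in.
    """
    gated = []
    for channel in feature_map:
        # squeeze: global average pooling over the spatial dimensions
        mean = sum(sum(row) for row in channel) / (len(channel) * len(channel[0]))
        # excite: map the pooled statistic to a (0, 1) gate
        gate = sigmoid(mean)
        # scale: multiply every spatial position in the channel by its gate
        gated.append([[v * gate for v in row] for row in channel])
    return gated

# toy 2-channel, 2x2 feature map
fmap = [[[1.0, 2.0], [3.0, 4.0]],
        [[-1.0, -2.0], [-3.0, -4.0]]]
out = channel_attention(fmap)
```

Channels with a large positive average response receive a gate near 1 and pass through almost unchanged, while weakly responding channels are suppressed; a spatial-attention branch would apply the same idea per spatial location instead of per channel.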

Keywords:
Hyperspectral imaging, Computer science, Convolutional neural network, Pattern recognition, Artificial intelligence, Convolution (computer science), Scale (ratio), Kernel (algebra), Contextual image classification, Feature extraction, Image (mathematics), Artificial neural network, Mathematics

Topics

Remote-Sensing Image Classification
Remote Sensing and Land Use
Advanced Image Fusion Techniques