JOURNAL ARTICLE

Parallel Graph Attention Network Model Based on Pixel and Superpixel Feature Fusion for Hyperspectral Image Classification

Abstract

With the development of hyperspectral sensors, an increasing amount of hyperspectral data has become accessible, and the classification of land-cover categories has gained significant attention. Existing classification methods typically extract features from either the pixel or the superpixel perspective. However, a single-scale feature extraction approach cannot consider the local and global features of land cover simultaneously, leading to suboptimal classification results. To address this issue, this paper proposes a parallel graph attention network model based on pixel and superpixel feature fusion (SSPGAT) for hyperspectral image classification. The proposed approach first employs spectral convolutional layers to reduce the redundant spectral dimension. It then uses graph attention networks (GATs) to extract local and global land-cover features from the pixel and superpixel perspectives, respectively. Finally, a fully connected network classifies the fused features from both branches. Experimental results on two different datasets demonstrate the effectiveness of the proposed approach.
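The two-branch design described above can be sketched in NumPy. This is a minimal illustration, not the authors' implementation: the graph attention layer follows the standard GAT formulation (project features, score neighbor pairs, mask non-edges, softmax-normalize, aggregate), and all sizes, graphs, and the pixel-to-superpixel assignment are hypothetical placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def gat_layer(H, A, W, a, slope=0.2):
    """One graph attention layer: H (N, F) node features, A (N, N)
    adjacency with self-loops, W (F, F') projection, a (2F',) attention
    vector. Returns attention-weighted neighbor aggregation (N, F')."""
    Z = H @ W                                   # projected features
    Fp = Z.shape[1]
    # e_ij = LeakyReLU(a^T [z_i || z_j]), via two half-products + broadcast
    e = (Z @ a[:Fp])[:, None] + (Z @ a[Fp:])[None, :]
    e = np.where(e > 0, e, slope * e)           # LeakyReLU
    e = np.where(A > 0, e, -1e9)                # restrict attention to edges
    alpha = softmax(e, axis=1)                  # attention coefficients
    return alpha @ Z

# --- toy parallel branches (all sizes hypothetical) ---
Np, Ns, F, Fp = 6, 3, 8, 4                      # pixels, superpixels, feature dims
Hp = rng.standard_normal((Np, F))               # pixel-level features (local branch)
Hs = rng.standard_normal((Ns, F))               # superpixel-level features (global branch)
Ap = np.eye(Np); Ap[0, 1] = Ap[1, 0] = 1        # pixel graph: self-loops + one edge
As = np.ones((Ns, Ns))                          # small dense superpixel graph

Wp, ap = rng.standard_normal((F, Fp)), rng.standard_normal(2 * Fp)
Ws, as_ = rng.standard_normal((F, Fp)), rng.standard_normal(2 * Fp)

Zp = gat_layer(Hp, Ap, Wp, ap)                  # local features from pixel graph
Zs = gat_layer(Hs, As, Ws, as_)                 # global features from superpixel graph

# broadcast superpixel features back to their member pixels and fuse
assign = np.array([0, 0, 1, 1, 2, 2])           # pixel -> superpixel (hypothetical)
fused = np.concatenate([Zp, Zs[assign]], axis=1)  # (Np, 2*Fp) fused representation
print(fused.shape)
```

In the paper's pipeline, `fused` would then pass through the fully connected classifier; here the fusion is plain concatenation, which is one common choice among several.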

Keywords:
Hyperspectral imaging; Pattern recognition; Pixel; Artificial intelligence; Graph attention network; Feature extraction; Feature fusion; Convolutional neural network; Land cover; Land use

Metrics

- Cited by: 3
- FWCI (Field-Weighted Citation Impact): 0.65
- References: 12
- Citation Normalized Percentile: 0.70

Topics

- Remote-Sensing Image Classification (Physical Sciences → Engineering → Media Technology)
- Advanced Chemical Sensor Technologies (Physical Sciences → Engineering → Biomedical Engineering)
- Advanced Image Fusion Techniques (Physical Sciences → Engineering → Media Technology)