JOURNAL ARTICLE

SKELETON-BASED ACTION RECOGNITION USING FEATURE FUSION FOR SPATIAL-TEMPORAL GRAPH CONVOLUTIONAL NETWORKS

Abstract

Human action recognition (HAR) has been used in a variety of applications such as gaming, healthcare, surveillance, and robotics. Research on utilizing data such as color, depth, and skeletal data has been extensively conducted to achieve high-performance HAR. Compared with color and depth data, skeletal data are more compact and therefore more efficient to compute and store. Moreover, skeletal data are invariant to clothing texture, background, and lighting conditions. With the boom of deep learning, HAR has received a lot of attention. The Spatial-Temporal Graph Convolutional Network (ST-GCN) has proved to be a state-of-the-art architecture for HAR using skeleton data. However, this does not hold when working with challenging datasets that contain incomplete and noisy skeletal data. In this paper, a new method is proposed for HAR by adding a Feature Fusion module and applying hyperparameter optimization. The performance of the proposed method is evaluated on the challenging CMDFALL dataset and the newly built MICA-Action3D dataset. Experimental results show that the proposed method significantly improves the performance of ST-GCN on these challenging datasets.
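The abstract does not specify how the proposed Feature Fusion module combines its inputs, but a common early-fusion scheme for skeleton input is to concatenate joint positions with bone vectors (the offset from each joint to its kinematic parent) along the channel axis before feeding the graph convolutional network. The sketch below illustrates only that generic idea; the joint count, edge list, and fusion operator here are illustrative assumptions, not the paper's actual design.

```python
import numpy as np

# Hypothetical skeleton sequence: shape (T, V, C) with
# T = frames, V = joints, C = coordinate channels (x, y, z).
# The 5-joint kinematic chain below is an illustrative toy graph,
# not the skeleton used in CMDFALL or MICA-Action3D.
T, V, C = 16, 5, 3
rng = np.random.default_rng(0)
joints = rng.standard_normal((T, V, C))

# Parent index of each joint in the kinematic tree (joint 0 is the root,
# so its "bone" vector is zero by construction).
parents = [0, 0, 1, 2, 3]

def fuse_features(joints, parents):
    """Concatenate joint positions with bone vectors (child - parent)
    along the channel axis: a simple early feature-fusion scheme."""
    bones = joints - joints[:, parents, :]            # (T, V, C) bone vectors
    return np.concatenate([joints, bones], axis=-1)   # (T, V, 2C) fused input

fused = fuse_features(joints, parents)
print(fused.shape)  # (16, 5, 6)
```

The fused `(T, V, 2C)` tensor can then be passed to an ST-GCN-style backbone in place of raw coordinates; richer variants also fuse motion (frame-to-frame differences) as extra channels.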

Keywords:
Computer science; Artificial intelligence; Convolutional neural network; Pattern recognition; Hyperparameter; Graph; Feature extraction; Computation; Deep learning; Machine learning; Algorithm

Metrics

Cited by: 1
FWCI (Field-Weighted Citation Impact): 0.10
References: 28
Citation Normalized Percentile: 0.44

Topics

Human Pose and Action Recognition
Physical Sciences →  Computer Science →  Computer Vision and Pattern Recognition
Context-Aware Activity Recognition Systems
Physical Sciences →  Computer Science →  Computer Vision and Pattern Recognition
Advanced Technologies in Various Fields
Physical Sciences →  Computer Science →  Artificial Intelligence