JOURNAL ARTICLE

Semi-Supervised Contrastive Learning for Human Activity Recognition

Abstract

Recent developments in deep learning have motivated the use of deep neural networks in mobile sensing applications. Human Activity Recognition (HAR), one of the most important mobile sensing applications, has benefited greatly from deep neural networks. Motivated by the success of self-supervised learning frameworks in computer vision and natural language processing, self-supervised models have been proposed to efficiently leverage massive unlabeled data and reduce the labeling burden of HAR applications. Current approaches use self-supervised pre-training (with unlabeled data) followed by downstream training (with labeled data). However, we argue that labeled data can still help during pre-training and propose SemiC-HAR, a Semi-supervised Contrastive learning framework for HAR. SemiC-HAR efficiently uses both labeled and unlabeled data during pre-training and combines the advantages of supervised and self-supervised contrastive learning frameworks. We evaluate SemiC-HAR on six HAR datasets with multiple sensing signals and show that it achieves performance comparable to previous supervised and semi-supervised models while using much smaller fractions of labeled data.
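The abstract does not include the authors' training procedure, but a minimal sketch of the idea it describes, assuming a SimCLR-style NT-Xent loss for the unlabeled stream, a SupCon-style loss for the labeled stream, and a hypothetical weighting factor alpha (none of which are confirmed details of SemiC-HAR), might look like this in PyTorch:

import torch
import torch.nn.functional as F

def nt_xent(z1, z2, temperature=0.5):
    # Self-supervised NT-Xent loss over two augmented views (SimCLR-style).
    n = z1.size(0)
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)   # (2n, d) unit embeddings
    sim = z @ z.t() / temperature                        # pairwise cosine similarities
    sim.fill_diagonal_(float('-inf'))                    # exclude self-similarity
    # The positive for sample i is its other augmented view, offset by n.
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)]).to(z.device)
    return F.cross_entropy(sim, targets)

def supcon(z, labels, temperature=0.1):
    # Supervised contrastive loss: windows sharing a label are positives.
    z = F.normalize(z, dim=1)
    sim = z @ z.t() / temperature
    self_mask = torch.eye(z.size(0), dtype=torch.bool, device=z.device)
    sim = sim.masked_fill(self_mask, float('-inf'))
    pos = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~self_mask
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
    # Mean log-probability over each anchor's positives; anchors with no
    # in-batch positive contribute zero.
    return -(log_prob * pos).sum(1).div(pos.sum(1).clamp(min=1)).mean()

def pretrain_step(encoder, u1, u2, x_lab, y_lab, alpha=0.5):
    # One semi-supervised pre-training step: self-supervised loss on two
    # unlabeled views plus a weighted supervised term on a labeled batch.
    loss_ssl = nt_xent(encoder(u1), encoder(u2))
    loss_sup = supcon(encoder(x_lab), y_lab)
    return loss_ssl + alpha * loss_sup

The relative weighting of the two terms and the choice of augmentations would be design decisions of the actual framework; this sketch only illustrates how labeled windows can contribute a supervised contrastive signal during pre-training rather than being reserved for downstream fine-tuning.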

Keywords:
Computer science, Artificial intelligence, Leverage (statistics), Machine learning, Labeled data, Supervised learning, Semi-supervised learning, Deep learning, Artificial neural network, Process (computing)

Metrics

Cited By: 24
FWCI (Field Weighted Citation Impact): 1.94
Refs: 48
Citation Normalized Percentile: 0.88

Topics

Context-Aware Activity Recognition Systems (Physical Sciences → Computer Science → Computer Vision and Pattern Recognition)
Indoor and Outdoor Localization Technologies (Physical Sciences → Engineering → Electrical and Electronic Engineering)
Human Pose and Action Recognition (Physical Sciences → Computer Science → Computer Vision and Pattern Recognition)