JOURNAL ARTICLE

Uncertainty-Guided Semi-Supervised Few-Shot Class-Incremental Learning With Knowledge Distillation

Yawen Cui, Wanxia Deng, Xin Xu, Zhen Liu, Zhong Liu, Matti Pietikäinen, Li Liu

Year: 2022 · Journal: IEEE Transactions on Multimedia · Vol: 25 · Pages: 6422-6435 · Publisher: Institute of Electrical and Electronics Engineers

Abstract

Class-Incremental Learning (CIL) aims at incrementally learning novel classes without forgetting old ones. This capability becomes more challenging when novel tasks contain only one or a few labeled training samples, which leads to a more practical learning scenario, i.e., Few-Shot Class-Incremental Learning (FSCIL). The dilemma of FSCIL lies in the serious overfitting and exacerbated catastrophic forgetting caused by the limited training data from novel classes. In this paper, motivated by the easy accessibility of unlabeled data, we conduct a pioneering study of the Semi-Supervised Few-Shot Class-Incremental Learning (Semi-FSCIL) problem, which requires the model to incrementally learn new classes from extremely limited labeled samples and a large number of unlabeled samples. To address this problem, a simple but efficient framework is first constructed based on knowledge distillation to alleviate catastrophic forgetting. To efficiently mitigate the overfitting problem on novel categories with unlabeled data, uncertainty-guided semi-supervised learning is incorporated into this framework to select unlabeled samples for each incremental session according to model uncertainty. This process provides extra reliable supervision for the distillation process and helps to better formulate the class means. Our extensive experiments on the CIFAR100, miniImageNet and CUB200 datasets demonstrate the promising performance of the proposed method and define baselines for this new research direction.
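For a concrete picture of the two ingredients the abstract names, the sketch below illustrates a standard temperature-scaled distillation loss and entropy-based selection of confident unlabeled samples. This is a minimal, hypothetical PyTorch illustration, not the authors' released code: the function names, the `entropy_threshold` value, and the assumption that the unlabeled loader yields raw batches are ours.

```python
# Hypothetical sketch of the two mechanisms described in the abstract:
# (1) a knowledge-distillation loss against the previous session's model, and
# (2) uncertainty-guided selection of unlabeled samples via predictive entropy.
import torch
import torch.nn.functional as F


def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """Soften both distributions and penalize their KL divergence, so the
    current (student) model retains the previous-session (teacher) model's
    responses on old classes."""
    t = temperature
    soft_teacher = F.softmax(teacher_logits / t, dim=1)
    log_student = F.log_softmax(student_logits / t, dim=1)
    return F.kl_div(log_student, soft_teacher, reduction="batchmean") * (t * t)


def select_confident_unlabeled(model, unlabeled_loader,
                               entropy_threshold=0.5, device="cpu"):
    """Keep only unlabeled samples on which the model is confident
    (low predictive entropy); return them with their pseudo-labels."""
    model.eval()
    selected, pseudo_labels = [], []
    with torch.no_grad():
        for x in unlabeled_loader:  # assumes the loader yields image batches
            x = x.to(device)
            probs = F.softmax(model(x), dim=1)
            entropy = -(probs * probs.clamp_min(1e-12).log()).sum(dim=1)
            mask = entropy < entropy_threshold  # low entropy = low uncertainty
            selected.append(x[mask].cpu())
            pseudo_labels.append(probs[mask].argmax(dim=1).cpu())
    return torch.cat(selected), torch.cat(pseudo_labels)
```

In this reading, the pseudo-labeled samples returned by the selection step would supply the "extra reliable supervision" for the distillation term and help estimate class means from more than the few labeled shots.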

Keywords:
Overfitting · Computer science · Artificial intelligence · Forgetting · Machine learning · Class (philosophy) · Pruning · Process (computing) · Focus (optics) · Generalization · Artificial neural network · Mathematics

Metrics

Cited by: 31
FWCI (Field-Weighted Citation Impact): 6.07
References: 94
Citation Normalized Percentile: 0.95 (in top 10%)

Topics

Domain Adaptation and Few-Shot Learning
Physical Sciences → Computer Science → Artificial Intelligence
Multimodal Machine Learning Applications
Physical Sciences → Computer Science → Computer Vision and Pattern Recognition
COVID-19 diagnosis using AI
Health Sciences → Medicine → Radiology, Nuclear Medicine and Imaging

Related Documents

JOURNAL ARTICLE

Uncertainty-Aware Distillation for Semi-Supervised Few-Shot Class-Incremental Learning

Yawen Cui, Wanxia Deng, Haoyu Chen, Li Liu

Journal: IEEE Transactions on Neural Networks and Learning Systems · Year: 2023 · Vol: 35 (10) · Pages: 14259-14272
JOURNAL ARTICLE

Few-Shot Class-Incremental Learning via Relation Knowledge Distillation

Songlin Dong, Xiaopeng Hong, Xiaoyu Tao, Xinyuan Chang, Xing Wei, Yihong Gong

Journal: Proceedings of the AAAI Conference on Artificial Intelligence · Year: 2021 · Vol: 35 (2) · Pages: 1255-1263