JOURNAL ARTICLE

Few-Shot Class-Incremental Learning via Relation Knowledge Distillation

Songlin Dong, Xiaopeng Hong, Xiaoyu Tao, Xinyuan Chang, Xing Wei, Yihong Gong

Year: 2021   Journal: Proceedings of the AAAI Conference on Artificial Intelligence   Vol: 35 (2)   Pages: 1255-1263   Publisher: Association for the Advancement of Artificial Intelligence

Abstract

In this paper, we focus on the challenging few-shot class-incremental learning (FSCIL) problem, which requires transferring knowledge from old tasks to new ones while mitigating catastrophic forgetting. We propose the exemplar relation distillation incremental learning framework to balance the tasks of old-knowledge preservation and new-knowledge adaptation. First, we construct an exemplar relation graph to represent the knowledge learned by the original network, and update it gradually as new tasks are learned. Then, an exemplar relation loss function for discovering the relational knowledge between different classes is introduced to learn and transfer the structural information in the relation graph. Extensive experiments demonstrate that relational knowledge does exist among the exemplars and that our approach outperforms other state-of-the-art class-incremental learning methods on the CIFAR100, miniImageNet, and CUB200 datasets.
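The abstract describes distilling the *structure* of an exemplar relation graph rather than matching features point-wise. As a rough illustration only (the paper's exact graph construction and loss are not given here), a relation-distillation-style loss can be sketched by comparing pairwise cosine-similarity graphs over exemplar embeddings from the teacher (old) and student (new) networks; the function name and formulation below are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def relation_distillation_loss(student_feats, teacher_feats):
    """Illustrative relation-style distillation loss (not the paper's exact loss).

    Builds an N x N cosine-similarity graph over the same N exemplars for
    both networks, then penalizes the squared difference between the two
    graphs, so the student preserves inter-class structure.
    """
    # Row-normalize so the Gram matrix holds cosine similarities
    s = student_feats / np.linalg.norm(student_feats, axis=1, keepdims=True)
    t = teacher_feats / np.linalg.norm(teacher_feats, axis=1, keepdims=True)
    graph_s = s @ s.T  # student relation graph
    graph_t = t @ t.T  # teacher relation graph
    return np.mean((graph_s - graph_t) ** 2)
```

Note that because the graphs use cosine similarity, the loss is invariant to per-row rescaling of the embeddings; only the relational structure between exemplars is constrained.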

Keywords:
Forgetting, Relation (database), Computer science, Artificial intelligence, Class (philosophy), Graph, Machine learning, Knowledge transfer, Knowledge graph, Theoretical computer science, Data mining

Metrics

Cited By: 136
FWCI (Field Weighted Citation Impact): 12.88
Refs: 67
Citation Normalized Percentile: 0.99 (in the top 1% and top 10%)

Topics

Domain Adaptation and Few-Shot Learning
Physical Sciences →  Computer Science →  Artificial Intelligence
Multimodal Machine Learning Applications
Physical Sciences →  Computer Science →  Computer Vision and Pattern Recognition