JOURNAL ARTICLE

Class Similarity Weighted Knowledge Distillation for Continual Semantic Segmentation

Vu Minh Hieu Phan, The-Anh Ta, Son Lam Phung, Long Tran-Thanh, Abdesselam Bouzerdoum

Year: 2022 | Venue: 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) | Pages: 16845-16854

Abstract

Deep learning models are known to suffer from catastrophic forgetting when they incrementally learn new classes. Continual learning for semantic segmentation (CSS) is an emerging field in computer vision. We identify a problem in CSS: a model tends to confuse old and new classes that are visually similar, which makes it forget the old ones. To address this gap, we propose REMINDER, a new CSS framework, and a novel class similarity weighted knowledge distillation (CSW-KD) method. Our CSW-KD method distills the knowledge of a previous model on old classes that are similar to the new ones. This provides two main benefits: (i) selectively revising old classes that are more likely to be forgotten, and (ii) better learning new classes by relating them to previously seen classes. Extensive experiments on the Pascal-VOC 2012 and ADE20K datasets show that our approach outperforms state-of-the-art methods on standard CSS settings by up to 7.07% and 8.49%, respectively.
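To make the idea concrete, the abstract's core mechanism can be sketched as a similarity-weighted distillation loss. The sketch below is a hypothetical simplification, not the paper's implementation: it assumes class prototypes are available (e.g. classifier weight vectors), derives per-old-class weights from cosine similarity to a new class, and weights each old class's teacher-student KL term accordingly, so old classes similar to the new one receive stronger distillation. All function names here are illustrative.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def class_similarity_weights(new_proto, old_protos):
    # Cosine similarity between the new-class prototype and each old-class
    # prototype, passed through a softmax so weights sum to 1.
    # Old classes similar to the new class get larger weights.
    # (Hypothetical stand-in for the paper's similarity weighting.)
    sims = old_protos @ new_proto / (
        np.linalg.norm(old_protos, axis=1) * np.linalg.norm(new_proto) + 1e-8
    )
    w = np.exp(sims)
    return w / w.sum()

def weighted_distillation_loss(student_logits, teacher_logits, weights):
    # Per-old-class KL-divergence terms between teacher and student
    # predictions, each term scaled by its class-similarity weight,
    # averaged over the batch of pixels/samples (shape: (N, C_old)).
    p_t = softmax(teacher_logits)
    p_s = softmax(student_logits)
    per_class = p_t * (np.log(p_t + 1e-8) - np.log(p_s + 1e-8))
    return float((per_class * weights).sum(axis=1).mean())
```

With identical teacher and student logits the loss is zero, and it grows as the student drifts from the teacher on the heavily weighted (most similar, hence most forgettable) old classes.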

Keywords:
Catastrophic forgetting, Continual learning, Semantic segmentation, Knowledge distillation, Semantic similarity, Deep learning, Machine learning, Computer science, Artificial intelligence

Metrics

Cited By: 54
FWCI (Field Weighted Citation Impact): 6.23
Refs: 55
Citation Normalized Percentile: 0.97 (in top 1% and top 10%)

Topics

Domain Adaptation and Few-Shot Learning
Physical Sciences →  Computer Science →  Artificial Intelligence
Multimodal Machine Learning Applications
Physical Sciences →  Computer Science →  Computer Vision and Pattern Recognition
COVID-19 diagnosis using AI
Health Sciences →  Medicine →  Radiology, Nuclear Medicine and Imaging