JOURNAL ARTICLE

Uncertainty-Aware Contrastive Distillation for Incremental Semantic Segmentation

Guanglei Yang, Enrico Fini, Dan Xu, Paolo Rota, Mingli Ding, Moin Nabi, Xavier Alameda-Pineda, Elisa Ricci

Year: 2022   Journal: IEEE Transactions on Pattern Analysis and Machine Intelligence   Vol: 45 (2)   Pages: 2567-2581   Publisher: IEEE Computer Society

Abstract

A fundamental and challenging problem in deep learning is catastrophic forgetting, i.e., the tendency of neural networks to fail to preserve the knowledge acquired from old tasks when learning new tasks. This problem has been widely investigated in the research community and several Incremental Learning (IL) approaches have been proposed in recent years. While earlier works in computer vision have mostly focused on image classification and object detection, more recently some IL approaches for semantic segmentation have been introduced. These previous works showed that, despite its simplicity, knowledge distillation can be effectively employed to alleviate catastrophic forgetting. In this paper, we follow this research direction and, inspired by recent literature on contrastive learning, we propose a novel distillation framework, Uncertainty-aware Contrastive Distillation (UCD). In a nutshell, UCD operates by introducing a novel distillation loss that takes into account all the images in a mini-batch, enforcing similarity between features associated with all the pixels from the same classes, and pulling apart those corresponding to pixels from different classes. In order to mitigate catastrophic forgetting, we contrast features of the new model with features extracted by a frozen model learned at the previous incremental step. Our experimental results demonstrate the advantage of the proposed distillation technique, which can be used in synergy with previous IL approaches, and leads to state-of-the-art performance on three commonly adopted benchmarks for incremental semantic segmentation.
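The contrastive distillation idea in the abstract (attract features of same-class pixels, repel different-class pixels, with positives/negatives drawn from a frozen previous-step model) can be sketched as an InfoNCE-style loss. This is a minimal, hypothetical illustration, not the paper's actual formulation: the function name, shapes, and temperature are assumptions, and the uncertainty weighting that gives UCD its name is omitted.

```python
import numpy as np

def contrastive_distill_loss(new_feats, old_feats, labels, tau=0.1):
    """InfoNCE-style pixel contrastive distillation (simplified sketch).

    new_feats: (N, D) pixel features from the current model (anchors).
    old_feats: (N, D) pixel features from the frozen previous-step model.
    labels:    (N,)   ground-truth class id of each pixel.
    """
    # Cosine similarities between current-model anchors and frozen features.
    new_feats = new_feats / np.linalg.norm(new_feats, axis=1, keepdims=True)
    old_feats = old_feats / np.linalg.norm(old_feats, axis=1, keepdims=True)
    sim = (new_feats @ old_feats.T) / tau          # (N, N)

    # Numerically stable log-softmax over all frozen features per anchor.
    m = sim.max(axis=1, keepdims=True)
    log_prob = sim - (m + np.log(np.exp(sim - m).sum(axis=1, keepdims=True)))

    # Positives: frozen features of pixels sharing the anchor's class.
    pos = (labels[:, None] == labels[None, :]).astype(float)

    # Negated average positive log-likelihood per anchor, averaged over anchors.
    return float(-((log_prob * pos).sum(axis=1) / pos.sum(axis=1)).mean())
```

Under this sketch, minimizing the loss pulls current-model features toward the frozen features of same-class pixels across the mini-batch and pushes them away from different-class ones, which is the mechanism the abstract describes for mitigating forgetting.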

Keywords:
Computer science, Natural language processing, Artificial intelligence, Segmentation, Machine learning, Data mining

Metrics

Cited By: 67
FWCI (Field Weighted Citation Impact): 12.92
References: 83
Citation Normalized Percentile: 0.98 (in top 1%; in top 10%)

Topics

Domain Adaptation and Few-Shot Learning
Physical Sciences →  Computer Science →  Artificial Intelligence
Anomaly Detection Techniques and Applications
Physical Sciences →  Computer Science →  Artificial Intelligence
Multimodal Machine Learning Applications
Physical Sciences →  Computer Science →  Computer Vision and Pattern Recognition

Related Documents

JOURNAL ARTICLE

Uncertainty-Aware and Decoupled Distillation for Semantic Segmentation

Cong Li, Gong Cheng, Junwei Han

Journal: International Journal of Computer Vision   Year: 2025   Vol: 134 (1)
JOURNAL ARTICLE

Prompt-Guided Semantic-Aware Distillation for Weakly Supervised Incremental Semantic Segmentation

Xuze Hao, Xuhao Jiang, Wenqian Ni, Weimin Tan, Bo Yan

Journal: IEEE Transactions on Circuits and Systems for Video Technology   Year: 2024   Vol: 34 (11)   Pages: 10632-10645
JOURNAL ARTICLE

Difference-Aware Distillation for Semantic Segmentation

Jianping Gou, Xiabin Zhou, Lan Du, Yibing Zhan, Wu Chen, Yi Zhang

Journal: IEEE Transactions on Multimedia   Year: 2024   Vol: 26   Pages: 10069-10080
JOURNAL ARTICLE

Hyperbolic Uncertainty Aware Semantic Segmentation

Bike Chen, Wei Peng, Xiaofeng Cao, Juha Röning

Journal: IEEE Transactions on Intelligent Transportation Systems   Year: 2023   Vol: 25 (2)   Pages: 1275-1290