JOURNAL ARTICLE

Bilateral Knowledge Distillation for Unsupervised Domain Adaptation of Semantic Segmentation

Yunnan Wang, Jianxun Li

Journal: 2022 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) Year: 2022 Pages: 10177-10184

Abstract

Unsupervised domain adaptation (UDA) aims to learn domain-invariant representations between the labeled source domain and the unlabeled target domain. Existing self-training-based UDA methods use ground truth and pseudo-labels to supervise source data and target data, respectively. However, strong supervision in the source domain and pseudo-label noise in the target domain lead to problems such as biased predictions and over-fitting. To tackle these issues, we propose a novel Bilateral Knowledge Distillation (BKD) framework for UDA in semantic segmentation, which adopts different knowledge distillation strategies depending on the domain. Specifically, we first introduce a Source-Flow Distillation (SD) to smooth the labels of source images, which weakens the supervision in the source domain. Meanwhile, a Target-Flow Distillation (TD) is designed to extract the inter-class knowledge in the probability map output by the teacher model, which alleviates the influence of pseudo-label noise in the target domain. Considering the class imbalance in semantic segmentation, we further propose an Image-Wise Hard Pixel Mining (HPM) to address this issue without estimating class frequency in the unlabeled target domain. The effectiveness of our framework against existing state-of-the-art methods is demonstrated by extensive experiments on two benchmarks: GTA5-to-Cityscapes and SYNTHIA-to-Cityscapes.
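The two distillation flows in the abstract can be illustrated with a minimal, self-contained Python sketch. This is not the authors' implementation: the per-pixel class probabilities, the smoothing factor `eps`, and the helper names (`smooth_labels`, `cross_entropy`, `kl_divergence`) are illustrative assumptions, and the Hard Pixel Mining step is omitted entirely.

```python
import math

def smooth_labels(one_hot, eps=0.1):
    # Source-Flow idea: soften the hard one-hot ground truth so that
    # source supervision is weakened, reducing over-fitting to source labels.
    k = len(one_hot)
    return [(1.0 - eps) * p + eps / k for p in one_hot]

def cross_entropy(target, pred):
    # Per-pixel cross-entropy H(target, pred), here used with smoothed labels.
    return -sum(t * math.log(p) for t, p in zip(target, pred) if t > 0)

def kl_divergence(teacher, student):
    # Target-Flow idea: distill the teacher's full probability map, so the
    # student learns inter-class relations rather than only a noisy hard
    # pseudo-label (the argmax of the teacher output).
    return sum(t * math.log(t / s) for t, s in zip(teacher, student) if t > 0)

# Toy 3-class example at a single pixel.
source_gt = smooth_labels([1.0, 0.0, 0.0], eps=0.1)  # softened ground truth
student = [0.7, 0.2, 0.1]                            # student probabilities
teacher = [0.6, 0.3, 0.1]                            # teacher probabilities

source_loss = cross_entropy(source_gt, student)      # source-flow term
target_loss = kl_divergence(teacher, student)        # target-flow term
```

In a full pipeline these per-pixel terms would be averaged over each image, with the paper's Image-Wise Hard Pixel Mining reweighting difficult pixels; that selection logic is not sketched here.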

Keywords:
Computer science; Segmentation; Distillation; Domain adaptation; Artificial intelligence; Domain (mathematical analysis); Image (mathematics); Noise (video); Class (philosophy); Pattern recognition (psychology); Machine learning; Computer vision; Mathematics

Metrics

Cited By: 4
FWCI (Field Weighted Citation Impact): 0.47
Refs: 41
Citation Normalized Percentile: 0.60

Topics

Domain Adaptation and Few-Shot Learning
Physical Sciences →  Computer Science →  Artificial Intelligence
COVID-19 diagnosis using AI
Health Sciences →  Medicine →  Radiology, Nuclear Medicine and Imaging
Multimodal Machine Learning Applications
Physical Sciences →  Computer Science →  Computer Vision and Pattern Recognition

Related Documents

JOURNAL ARTICLE

Cross-Domain Correlation Distillation for Unsupervised Domain Adaptation in Nighttime Semantic Segmentation

Huan Gao, Jichang Guo, Guoli Wang, Qian Zhang

Journal: 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Year: 2022 Pages: 9903-9913
JOURNAL ARTICLE

Multi-Head Distillation for Continual Unsupervised Domain Adaptation in Semantic Segmentation

Antoine Saporta, Arthur Douillard, Tuan-Hung Vu, Patrick Pérez, Matthieu Cord

Journal: 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW) Year: 2022
JOURNAL ARTICLE

Knowledge distillation for BERT unsupervised domain adaptation

Minho Ryu, Geonseok Lee, Kichun Lee

Journal: Knowledge and Information Systems Year: 2022 Vol: 64 (11) Pages: 3113-3128
JOURNAL ARTICLE

Rethinking unsupervised domain adaptation for semantic segmentation

Zhijie Wang, Masanori Suganuma, Takayuki Okatani

Journal: Pattern Recognition Letters Year: 2024 Vol: 186 Pages: 119-125