JOURNAL ARTICLE

Prototype-guided Unsupervised Domain Adaptation for Semantic Segmentation

Abstract

Semantic segmentation is one of the most important research directions in computer vision, with wide applications in autonomous driving, medical imaging, intelligent security, and other fields. Unsupervised domain adaptation has become a mainstream research topic in recent years: it uses a large number of labeled source samples to perform segmentation in a target domain that has no labeled samples. In this paper, we propose a prototype-guided unsupervised domain adaptation method for semantic segmentation built on the ProDA model. Because labeled target samples and the prior probability are unavailable, we propose a prototype distance loss on the target domain, which optimizes the feature distribution by measuring the distance between features and the continually updated prototypes and by applying an adaptive threshold strategy. We further propose a smoothing loss that alleviates the influence of source samples on the model and improves the network's prediction performance. Experiments on the GTA5-to-Cityscapes scenario show that, compared with the original model, the proposed losses improve mIoU by 1.52.
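To make the two proposed losses concrete, below is a minimal PyTorch sketch. It is not the authors' implementation: it assumes ProDA-style prototypes maintained as exponential-moving-average class centroids, interprets the "adaptive threshold strategy" as a per-batch distance quantile, and interprets the "smoothing loss" as label-smoothed cross entropy on source samples. All function names and hyperparameters (momentum, quantile, eps) are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def update_prototypes(prototypes, feats, pseudo_labels, momentum=0.99):
    """EMA update of per-class prototypes (K, C) from target features.

    feats: (N, C) pixel features; pseudo_labels: (N,) hard pseudo-labels.
    Sketch of ProDA-style prototypes as moving class centroids.
    """
    for c in pseudo_labels.unique():
        mask = pseudo_labels == c
        if mask.any():
            centroid = feats[mask].mean(dim=0).detach()
            prototypes[c] = momentum * prototypes[c] + (1 - momentum) * centroid
    return prototypes

def prototype_distance_loss(feats, prototypes, pseudo_labels, quantile=0.5):
    """Pull target features toward their pseudo-label prototype, keeping
    only pixels whose distance falls below an adaptive (batch-quantile)
    threshold -- one plausible reading of the adaptive threshold strategy."""
    dists = torch.cdist(feats, prototypes)              # (N, K) distances
    d_assigned = dists.gather(1, pseudo_labels.unsqueeze(1)).squeeze(1)
    threshold = torch.quantile(d_assigned, quantile)    # adaptive per batch
    keep = d_assigned <= threshold
    if not keep.any():
        return feats.new_zeros(())
    return d_assigned[keep].mean()

def smoothing_loss(logits, labels, eps=0.1):
    """Label-smoothed cross entropy on source samples, a common way to
    soften source supervision; the paper's formulation may differ."""
    log_probs = F.log_softmax(logits, dim=1)
    n_cls = logits.size(1)
    smooth = torch.full_like(log_probs, eps / (n_cls - 1))
    smooth.scatter_(1, labels.unsqueeze(1), 1.0 - eps)
    return -(smooth * log_probs).sum(dim=1).mean()
```

In a training loop, these two terms would simply be added, with suitable weights, to the usual self-training segmentation objective.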

Keywords:
Semantic segmentation; domain adaptation; image segmentation; computer vision; machine learning; pattern recognition; smoothing; artificial intelligence
