JOURNAL ARTICLE

Self-Distillation for Unsupervised 3D Domain Adaptation

Abstract

Point cloud classification is a popular task in 3D vision. However, previous works usually assume that point clouds at test time are obtained with the same procedure or sensor as those at training time. Unsupervised Domain Adaptation (UDA) instead breaks this assumption and tries to solve the task on an unlabeled target domain, leveraging only a supervised source domain. For point cloud classification, recent UDA methods try to align features across domains via auxiliary tasks such as point cloud reconstruction, which however do not optimize the discriminative power of the feature space in the target domain. In contrast, in this work, we focus on obtaining a discriminative feature space for the target domain by enforcing consistency between a point cloud and its augmented version. We then propose a novel iterative self-training methodology that exploits Graph Neural Networks in the UDA context to refine pseudo-labels. We perform extensive experiments and set the new state of the art on standard UDA benchmarks for point cloud classification. Finally, we show how our approach can be extended to more complex tasks such as part segmentation.
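The consistency idea described in the abstract can be sketched in a few lines. The snippet below is a toy illustration only, not the authors' method: `augment`, `encode`, and `consistency_loss` are hypothetical stand-ins (a random rotation plus jitter, a PointNet-style per-point MLP with max pooling, and an L2 penalty between the features of the two views of an unlabeled target-domain cloud).

```python
import numpy as np

rng = np.random.default_rng(0)

def augment(points, rng):
    # Hypothetical augmentation: random rotation about the z axis plus small jitter.
    theta = rng.uniform(0, 2 * np.pi)
    c, s = np.cos(theta), np.sin(theta)
    rot = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
    return points @ rot.T + rng.normal(scale=0.01, size=points.shape)

def encode(points, weights):
    # Stand-in encoder: shared per-point linear layer + ReLU, then max pooling
    # over points to get a single global feature (PointNet-style).
    hidden = np.maximum(points @ weights, 0.0)
    return hidden.max(axis=0)

def consistency_loss(feat_a, feat_b):
    # Penalize disagreement between the features of the two views.
    return float(np.mean((feat_a - feat_b) ** 2))

points = rng.normal(size=(1024, 3))        # one unlabeled target-domain cloud
weights = rng.normal(size=(3, 64)) * 0.1   # toy encoder parameters

feat_orig = encode(points, weights)
feat_aug = encode(augment(points, rng), weights)
loss = consistency_loss(feat_orig, feat_aug)
print(loss)
```

Minimizing such a loss over unlabeled target clouds pushes the encoder toward augmentation-invariant, and hence more discriminative, target features; the paper's actual objective and architecture differ from this sketch.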

Keywords:
Computer science, Point cloud, Discriminative model, Artificial intelligence, Machine learning, Domain adaptation, Pattern recognition, Segmentation, Feature extraction, Cloud computing, Mathematics

Metrics

Cited By: 18
FWCI (Field Weighted Citation Impact): 2.60
Refs: 84
Citation Normalized Percentile: 0.89

Topics

Domain Adaptation and Few-Shot Learning
Physical Sciences →  Computer Science →  Artificial Intelligence
3D Shape Modeling and Analysis
Physical Sciences →  Engineering →  Computational Mechanics
Human Pose and Action Recognition
Physical Sciences →  Computer Science →  Computer Vision and Pattern Recognition

Related Documents

JOURNAL ARTICLE

Transferable adversarial masked self-distillation for unsupervised domain adaptation

Yuelong Xia, Lijun Yun, Chengfu Yang

Journal: Complex & Intelligent Systems, Year: 2023, Vol: 9 (6), Pages: 6567-6580
JOURNAL ARTICLE

Knowledge distillation for BERT unsupervised domain adaptation

Minho Ryu, Geonseok Lee, Kichun Lee

Journal: Knowledge and Information Systems, Year: 2022, Vol: 64 (11), Pages: 3113-3128
JOURNAL ARTICLE

Self-corrected unsupervised domain adaptation

Yunyun Wang, Chao Wang, Hui Xue, Songcan Chen

Journal: Frontiers of Computer Science, Year: 2021, Vol: 16 (5)