JOURNAL ARTICLE

Knowledge Distillation-Based Domain-Invariant Representation Learning for Domain Generalization

Ziwei Niu, Junkun Yuan, Xu Ma, Yingying Xu, Jing Liu, Yen-Wei Chen, Ruofeng Tong, Lanfen Lin

Year: 2023 | Journal: IEEE Transactions on Multimedia | Vol: 26 | Pages: 245-255 | Publisher: Institute of Electrical and Electronics Engineers

Abstract

Domain generalization (DG) aims to generalize knowledge learned from multiple source domains to unseen target domains. Existing DG techniques fall into two broad categories: domain-invariant representation learning and domain manipulation. However, it is extremely difficult to explicitly augment or generate unseen target data, and as the variety of source domains increases, building a domain-invariant model by simply aligning more domain-specific information becomes harder. In this paper, we propose a simple yet effective method for domain generalization, named Knowledge Distillation based Domain-invariant Representation Learning (KDDRL), which learns domain-invariant representations while encouraging the model to retain domain-specific features, a strategy recently shown to be effective for domain generalization. To this end, our method incorporates multiple auxiliary student models and one student leader model that perform a two-stage distillation. In the first stage, each domain-specific auxiliary student treats the ensemble of the other auxiliary students' predictions as its target, which helps to excavate domain-invariant representations. We also present an error-removal module that prevents the transfer of faulty information by eliminating predictions that contradict the true labels. In the second stage, the student leader model, which retains domain-specific features, combines the domain-invariant representation learned by the group of auxiliary students to make the final prediction. Extensive experiments and in-depth analysis on popular DG benchmark datasets demonstrate that KDDRL significantly outperforms current state-of-the-art methods.
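The first-stage distillation described above can be sketched in a few lines: each auxiliary student's target is the ensemble of its peers' predictions, with an error-removal mask that discards peer predictions whose argmax disagrees with the true label. This is a minimal NumPy sketch under our own assumptions — the function names, the averaging scheme, and the one-hot fallback (used when every peer prediction for a sample is wrong) are illustrative choices, not details taken from the paper.

```python
import numpy as np

def first_stage_targets(probs, labels):
    """Build each auxiliary student's distillation target from the
    ensemble of the *other* students' predictions, after an
    error-removal step that drops peer predictions whose argmax
    disagrees with the true label.

    probs:  (S, B, C) softmax outputs of S domain-specific students
    labels: (B,) integer class labels
    Returns targets of shape (S, B, C); falls back to the one-hot
    true label when no peer prediction for a sample is correct.
    """
    S, B, C = probs.shape
    correct = probs.argmax(-1) == labels[None, :]   # (S, B) error-removal mask
    onehot = np.eye(C)[labels]                      # (B, C) fallback target
    targets = np.empty_like(probs)
    for s in range(S):
        peer_mask = np.delete(correct, s, axis=0)   # peers of student s
        peer_probs = np.delete(probs, s, axis=0)    # (S-1, B, C)
        w = peer_mask[..., None].astype(float)
        denom = w.sum(0)                            # (B, 1) count of correct peers
        ens = (w * peer_probs).sum(0) / np.maximum(denom, 1.0)
        targets[s] = np.where(denom > 0, ens, onehot)
    return targets

def kl_distill_loss(student_probs, targets, eps=1e-12):
    """Mean KL(target || student), a common distillation objective."""
    return np.mean(np.sum(
        targets * (np.log(targets + eps) - np.log(student_probs + eps)),
        axis=-1))
```

In the second stage, the student leader would be trained against the domain-invariant ensemble produced by `first_stage_targets` in the same way, combined with its usual supervised loss; that stage is omitted here since the abstract gives no further detail.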

Keywords:
Computer science, Distillation, Invariant representation, Domain generalization, Artificial intelligence, Benchmark, Machine learning, Pattern recognition

Metrics

Cited By: 39
FWCI (Field-Weighted Citation Impact): 9.96
References: 72
Citation Normalized Percentile: 0.98 (in top 1%)

Topics

Domain Adaptation and Few-Shot Learning
Physical Sciences →  Computer Science →  Artificial Intelligence
Multimodal Machine Learning Applications
Physical Sciences →  Computer Science →  Computer Vision and Pattern Recognition