JOURNAL ARTICLE

Target Guided Knowledge Distillation for Cross-Domain Few-Shot Learning

Abstract

Existing few-shot classification methods suffer severe performance degradation when there is a domain gap between the base-class data used for training and the novel-class data used for the downstream task. Cross-domain few-shot learning (CD-FSL) addresses this problem by relaxing the restrictive assumption in few-shot learning (FSL) that the base (source) classes and the novel (target) classes are sampled from the same domain. Because of the domain gap between source and target data, a model trained on the source domain is often biased when transferred to the target domain, so it generalizes poorly to the target task. To overcome this problem, we propose a new target guided knowledge distillation (TGKD) framework. We design two distinct knowledge distillation methods that improve the model's generalization and its fit to the target task in the teacher training and knowledge distillation stages, respectively. Specifically, in the teacher training phase, we propose a self-cross distillation to reduce intra-class semantic variation, thereby improving the robustness of the model's feature representations. In the knowledge distillation phase, we propose a dual-student network that exploits a small portion of unlabeled target samples to alleviate the model's bias when migrating to the target data. Extensive experiments on several benchmark datasets demonstrate the effectiveness of our method.
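The abstract describes two distillation stages (teacher training, then teacher-to-student transfer). As a generic illustration only, and not the paper's exact losses, the standard temperature-scaled knowledge distillation objective that such frameworks build on can be sketched as follows; the function names and the temperature value are illustrative assumptions:

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits (numerically stable)."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=4.0):
    """KL(teacher || student) between temperature-softened distributions,
    scaled by T^2 so gradients keep a comparable magnitude across temperatures."""
    p = softmax(teacher_logits, temperature)   # soft targets from the teacher
    q = softmax(student_logits, temperature)   # student's softened predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return (temperature ** 2) * kl
```

The loss is zero when the student matches the teacher exactly and grows as their softened predictions diverge; a higher temperature exposes more of the teacher's inter-class similarity structure to the student.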

Keywords:
Computer science, Knowledge distillation, Artificial intelligence, Robustness, Machine learning, Benchmark datasets, Domain adaptation, Domain knowledge, Task, Data mining, Mathematics, Engineering

Metrics

Cited By: 1
FWCI (Field Weighted Citation Impact): 0.26
Refs: 24
Citation Normalized Percentile: 0.61


Topics

Domain Adaptation and Few-Shot Learning
Physical Sciences →  Computer Science →  Artificial Intelligence
COVID-19 diagnosis using AI
Health Sciences →  Medicine →  Radiology, Nuclear Medicine and Imaging
Geophysical Methods and Applications
Physical Sciences →  Engineering →  Ocean Engineering

Related Documents

JOURNAL ARTICLE

Knowledge-guided distribution alignment for cross-domain few-shot learning

Jiale Chen, Feng Xu, Xin Lyu, Zeng Tao, Xin Li

Journal:   Knowledge-Based Systems Year: 2025 Vol: 329 Article: 114316
JOURNAL ARTICLE

Hyperbolic Insights With Knowledge Distillation for Cross-Domain Few-Shot Learning

Xi Yang, Dechen Kong, Nannan Wang, Xinbo Gao

Journal:   IEEE Transactions on Image Processing Year: 2025 Vol: 34 Pages: 1921-1933
JOURNAL ARTICLE

TGDM: Target Guided Dynamic Mixup for Cross-Domain Few-Shot Learning

Linhai Zhuo, Yuqian Fu, Jingjing Chen, Yixin Cao, Yu-Gang Jiang

Journal:   Proceedings of the 30th ACM International Conference on Multimedia Year: 2022 Pages: 6368-6376