JOURNAL ARTICLE

Hyperbolic Insights With Knowledge Distillation for Cross-Domain Few-Shot Learning

Xi Yang, Dechen Kong, Nannan Wang, Xinbo Gao

Year: 2025 · Journal: IEEE Transactions on Image Processing · Vol: 34 · Pages: 1921-1933 · Publisher: Institute of Electrical and Electronics Engineers

Abstract

Cross-domain few-shot learning aims to achieve swift generalization between a source domain and a target domain using a limited number of images. Current research predominantly relies on generalized feature embeddings, employing metric classifiers in Euclidean space for classification. However, due to the disparities among different data domains, attaining generalized features in the embedding is challenging. Additionally, as the number of data domains grows, the Euclidean embedding space becomes increasingly high-dimensional. To address these problems, we introduce a cross-domain few-shot learning method named Hyperbolic Insights with Knowledge Distillation (HIKD). By integrating knowledge distillation, it enhances the model's generalization ability and thereby significantly improves task performance. Hyperbolic space, in comparison to Euclidean space, offers greater representational capacity and supports learning hierarchical structures among images, which aids generalized learning across different data domains. We therefore map Euclidean features into hyperbolic space via hyperbolic embedding and apply a hyperbolic fitting distillation method in the meta-training phase to obtain a unified multi-domain generalization representation. In the meta-testing phase, accounting for biases between the source and target domains, we present a hyperbolic adaptive module that adjusts embedded features to reduce the inter-domain gap. Experiments on the Meta-Dataset demonstrate that HIKD outperforms state-of-the-art methods with an average accuracy of 80.6%.
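The abstract describes mapping Euclidean features into hyperbolic space via hyperbolic embedding. The standard construction for this (used widely in hyperbolic representation learning, though the paper's exact implementation is not shown here) is the exponential map at the origin of the Poincaré ball, with distances computed via Möbius addition. A minimal NumPy sketch, assuming a Poincaré ball of curvature `-c` with `c` as a free hyperparameter:

```python
import numpy as np

def expmap0(v, c=1.0):
    """Map a Euclidean feature vector v onto the Poincare ball of
    curvature -c via the exponential map at the origin. The result
    always has norm < 1/sqrt(c)."""
    sqrt_c = np.sqrt(c)
    norm = np.linalg.norm(v)
    if norm == 0.0:
        return v
    return np.tanh(sqrt_c * norm) * v / (sqrt_c * norm)

def mobius_add(x, y, c=1.0):
    """Mobius addition, the hyperbolic analogue of vector addition."""
    xy, x2, y2 = np.dot(x, y), np.dot(x, x), np.dot(y, y)
    num = (1 + 2 * c * xy + c * y2) * x + (1 - c * x2) * y
    den = 1 + 2 * c * xy + c**2 * x2 * y2
    return num / den

def poincare_dist(x, y, c=1.0):
    """Geodesic distance between two points on the Poincare ball;
    a drop-in replacement for Euclidean distance in a metric classifier."""
    sqrt_c = np.sqrt(c)
    return (2 / sqrt_c) * np.arctanh(sqrt_c * np.linalg.norm(mobius_add(-x, y, c)))
```

In a metric-based few-shot classifier, `poincare_dist` would replace the Euclidean distance between a query embedding and class prototypes; the function names and the choice of NumPy here are illustrative, not taken from the paper.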

Keywords:
Computer science, Artificial intelligence, Machine learning, Pattern recognition, Knowledge distillation, Few-shot learning, Domain adaptation

Metrics

Cited By: 2
FWCI (Field Weighted Citation Impact): 7.00
References: 65
Citation Normalized Percentile: 0.91
In top 1% / In top 10% by citations

Topics

Model Reduction and Neural Networks
Physical Sciences →  Physics and Astronomy →  Statistical and Nonlinear Physics
Domain Adaptation and Few-Shot Learning
Physical Sciences →  Computer Science →  Artificial Intelligence