JOURNAL ARTICLE

Adaptive multi-teacher multi-level knowledge distillation

Yuang Liu, Wei Zhang, Jun Wang

Year: 2020  Journal: Neurocomputing  Vol: 415  Pages: 106-113  Publisher: Elsevier BV
Keywords:
Computer science, Artificial intelligence, Machine learning, Distillation

Metrics

Cited By: 188
FWCI (Field-Weighted Citation Impact): 12.63
Refs: 52
Citation Normalized Percentile: 0.99 (in top 1%)


Topics

Domain Adaptation and Few-Shot Learning (Physical Sciences → Computer Science → Artificial Intelligence)
Multimodal Machine Learning Applications (Physical Sciences → Computer Science → Computer Vision and Pattern Recognition)
Advanced Graph Neural Networks (Physical Sciences → Computer Science → Artificial Intelligence)

Related Documents

BOOK-CHAPTER

Adaptive Multi-teacher Knowledge Distillation with Class Attention Transfer

Xin Cheng, Jinjia Zhou

Communications in Computer and Information Science  Year: 2025  Pages: 210-224
BOOK-CHAPTER

Correlation Guided Multi-teacher Knowledge Distillation

Luyao Shi, Ning Jiang, Jialiang Tang, Xinlei Huang

Lecture Notes in Computer Science  Year: 2023  Pages: 562-574