BOOK-CHAPTER

Correlation Guided Multi-teacher Knowledge Distillation

Luyao Shi, Ning Jiang, Jialiang Tang, Xinlei Huang

Series:   Lecture Notes in Computer Science Year: 2023 Pages: 562-574 Publisher: Springer Science+Business Media
Keywords:
Computer science, Distillation, Preference, Knowledge transfer, Perspective (graphical), Limiting, Knowledge engineering, Knowledge acquisition, Mathematics education, Machine learning, Artificial intelligence, Knowledge management, Mathematics, Engineering
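For orientation only, the sketch below shows a generic weighted multi-teacher knowledge distillation loss in PyTorch. It is a baseline illustration, not the chapter's correlation-guided method (which this record does not describe); the uniform teacher weights, temperature T, and mixing factor alpha are hypothetical placeholders.

import torch
import torch.nn.functional as F

def multi_teacher_kd_loss(student_logits, teacher_logits_list, labels,
                          weights=None, T=4.0, alpha=0.7):
    """Generic multi-teacher KD loss (illustrative baseline).

    student_logits: (B, C) raw student outputs.
    teacher_logits_list: list of (B, C) raw outputs, one per teacher.
    weights: per-teacher mixing weights; uniform if None. A correlation-
             guided method would presumably make these data-dependent,
             but that derivation is not given on this page.
    """
    if weights is None:
        weights = [1.0 / len(teacher_logits_list)] * len(teacher_logits_list)

    # Blend the teachers' temperature-softened distributions into one target.
    target = sum(w * F.softmax(t / T, dim=1)
                 for w, t in zip(weights, teacher_logits_list))

    # KL divergence between the student's softened prediction and the
    # blended target; the T**2 factor keeps gradients comparable across T.
    kd = F.kl_div(F.log_softmax(student_logits / T, dim=1),
                  target, reduction="batchmean") * (T ** 2)

    # Standard supervised term on the hard labels.
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce

With uniform weights this reduces to the common averaged-teacher baseline; correlation-guided variants replace the fixed weights with scores computed per sample or per teacher.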

Metrics

Cited By: 0
FWCI (Field-Weighted Citation Impact): 0.00
Refs: 26
Citation Normalized Percentile: 0.43

Topics

Domain Adaptation and Few-Shot Learning
Physical Sciences → Computer Science → Artificial Intelligence
Advanced Neural Network Applications
Physical Sciences → Computer Science → Computer Vision and Pattern Recognition
Neural Networks and Applications
Physical Sciences → Computer Science → Artificial Intelligence

Related Documents

JOURNAL ARTICLE

Adaptive multi-teacher multi-level knowledge distillation

Yuang Liu, Wei Zhang, Jun Wang

Journal:   Neurocomputing Year: 2020 Vol: 415 Pages: 106-113
CONFERENCE PAPER

Confidence-Aware Multi-Teacher Knowledge Distillation

Hailin Zhang, Defang Chen, Can Wang

Proceedings:   ICASSP 2022 - 2022 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP) Year: 2022 Pages: 4498-4502
CONFERENCE PAPER

Densely Guided Knowledge Distillation using Multiple Teacher Assistants

Wonchul Son, Jaemin Na, Jun Yong Choi, Wonjun Hwang

Proceedings:   2021 IEEE/CVF International Conference on Computer Vision (ICCV) Year: 2021 Pages: 9375-9384