JOURNAL ARTICLE

Relation-Based Multi-Teacher Knowledge Distillation

Keywords:
Relation (database), Distillation, Computer science, Chromatography, Chemistry, Data mining

Metrics

Cited By: 2
FWCI (Field Weighted Citation Impact): 3.29
Refs: 19
Citation Normalized Percentile: 0.86

Topics

Innovative Teaching and Learning Methods (Social Sciences → Psychology → Developmental and Educational Psychology)
Intelligent Tutoring Systems and Adaptive Learning (Physical Sciences → Computer Science → Artificial Intelligence)
Education and Critical Thinking Development (Social Sciences → Social Sciences → Education)

Related Documents

JOURNAL ARTICLE

Anomaly detection based on multi-teacher knowledge distillation

Ye Ma, Jiang Xu, Nan Guan, Yi Wang

Journal:   Journal of Systems Architecture Year: 2023 Vol: 138 Pages: 102861-102861
JOURNAL ARTICLE

Adaptive multi-teacher multi-level knowledge distillation

Yuang Liu, Wei Zhang, Jun Wang

Journal:   Neurocomputing Year: 2020 Vol: 415 Pages: 106-113
JOURNAL ARTICLE

Confidence-Aware Multi-Teacher Knowledge Distillation

Hailin Zhang, Defang Chen, Can Wang

Conference:   ICASSP 2022 - 2022 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP) Year: 2022 Pages: 4498-4502