JOURNAL ARTICLE

Attention and feature transfer based knowledge distillation

Guoliang Yang, Shuaiying Yu, Yangyang Sheng, Hao Yang

Journal: Scientific Reports Year: 2023 Vol: 13 (1) Article number: 18369 Publisher: Nature Portfolio
Keywords:
Computer science, Inference, Feature, Benchmark, Artificial intelligence, Block, Knowledge transfer, Machine learning, Process, Distillation, Artificial neural network, Knowledge management, Mathematics

Metrics

Cited by: 11
FWCI (Field-Weighted Citation Impact): 2.00
References: 35
Citation Normalized Percentile: 0.84

Topics

Advanced Neural Network Applications
Physical Sciences →  Computer Science →  Computer Vision and Pattern Recognition
Domain Adaptation and Few-Shot Learning
Physical Sciences →  Computer Science →  Artificial Intelligence
Machine Learning and ELM
Physical Sciences →  Computer Science →  Artificial Intelligence

Related Documents

JOURNAL ARTICLE

Knowledge Distillation With Feature Self Attention

Sin-Gu Park, Dong‐Joong Kang

Journal:   IEEE Access Year: 2023 Vol: 11 Pages: 34554-34562
CONFERENCE PAPER

Attention-based Feature Interaction for Efficient Online Knowledge Distillation

Tongtong Su, Qiyu Liang, Jinsong Zhang, Zhaoyang Yu, Gang Wang, Xiaoguang Liu

Conference: 2021 IEEE International Conference on Data Mining (ICDM) Year: 2021 Pages: 579-588
JOURNAL ARTICLE

Artistic Style Transfer Based on Attention with Knowledge Distillation

Hanadi Al‐Mekhlafi, Shiguang Liu

Journal:   Computer Graphics Forum Year: 2024 Vol: 43 (6)
JOURNAL ARTICLE

Hierarchical Multi-Attention Transfer for Knowledge Distillation

Jianping Gou, Liyuan Sun, Baosheng Yu, Shaohua Wan, Dacheng Tao

Journal: ACM Transactions on Multimedia Computing, Communications, and Applications Year: 2022 Vol: 20 (2) Pages: 1-20