JOURNAL ARTICLE

Adversarial Distillation for Efficient Recommendation with External Knowledge

Xu Chen, Yongfeng Zhang, Hongteng Xu, Zheng Qin, Hongyuan Zha

Year: 2018 Journal: ACM Transactions on Information Systems Vol: 37 (1) Pages: 1-28

Abstract

Integrating external knowledge into recommender systems has attracted increasing attention in both industry and the academic community. Recent methods mostly leverage the power of neural networks for effective knowledge representation to improve recommendation performance. However, the heavy deep architectures in existing models are usually incorporated in an embedded manner, which may greatly increase model complexity and lower runtime efficiency. To harness the power of deep learning for external knowledge modeling while maintaining model efficiency at test time, we reformulate the problem of recommendation with external knowledge into a generalized distillation framework. The general idea is to move the complex deep architecture into a separate model that is used only in the training phase and abandoned at test time. In particular, during training, the external knowledge is processed by a comprehensive teacher model to produce valuable information that teaches a simple and efficient student model. Once the framework is learned, the teacher model is discarded, and only the succinct yet enhanced student model is used to make fast predictions at test time. In this article, we specify the external knowledge as user reviews, and to leverage them effectively, we further extend the traditional generalized distillation framework by designing a Selective Distillation Network (SDNet) with adversarial adaptation and orthogonality-constraint strategies to make it more robust to noisy information. Extensive experiments verify that our model not only improves rating-prediction performance but also significantly reduces prediction time compared with several state-of-the-art methods.
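
A concrete sketch helps make the abstract's pipeline precise. The following is a minimal, illustrative PyTorch sketch of the generalized distillation idea, not the authors' actual SDNet: it omits the adversarial adaptation and orthogonality-constraint strategies, and every name in it (Teacher, Student, distill_step, the lam weight) is hypothetical. The teacher consumes review-derived features alongside user/item embeddings; the student sees only user/item IDs; after training, only the student is kept.

import torch
import torch.nn as nn

EMB_DIM = 32

class Teacher(nn.Module):
    """Heavy model: uses review features in addition to user/item embeddings."""
    def __init__(self, n_users, n_items, review_dim):
        super().__init__()
        self.user_emb = nn.Embedding(n_users, EMB_DIM)
        self.item_emb = nn.Embedding(n_items, EMB_DIM)
        self.review_net = nn.Sequential(
            nn.Linear(review_dim, 64), nn.ReLU(), nn.Linear(64, EMB_DIM))
        self.out = nn.Linear(3 * EMB_DIM, 1)

    def forward(self, u, i, review_feats):
        r = self.review_net(review_feats)
        h = torch.cat([self.user_emb(u), self.item_emb(i), r], dim=-1)
        return self.out(h).squeeze(-1)

class Student(nn.Module):
    """Light model: a plain embedding dot product, cheap at test time."""
    def __init__(self, n_users, n_items):
        super().__init__()
        self.user_emb = nn.Embedding(n_users, EMB_DIM)
        self.item_emb = nn.Embedding(n_items, EMB_DIM)

    def forward(self, u, i):
        return (self.user_emb(u) * self.item_emb(i)).sum(-1)

def distill_step(teacher, student, opt, u, i, review_feats, rating, lam=0.5):
    """One step: fit observed ratings while imitating the teacher's outputs."""
    t_pred = teacher(u, i, review_feats).detach()  # teacher acts as supervision
    s_pred = student(u, i)
    mse = nn.functional.mse_loss
    loss = (1 - lam) * mse(s_pred, rating) + lam * mse(s_pred, t_pred)
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()

# Toy usage with random stand-in data for ratings and review features.
n_users, n_items, review_dim = 100, 200, 50
teacher = Teacher(n_users, n_items, review_dim)  # assume already trained
student = Student(n_users, n_items)
opt = torch.optim.Adam(student.parameters(), lr=1e-2)
u = torch.randint(0, n_users, (16,))
i = torch.randint(0, n_items, (16,))
reviews = torch.randn(16, review_dim)
ratings = torch.rand(16) * 5
print(distill_step(teacher, student, opt, u, i, reviews, ratings))

The design point matches the abstract: the expensive review pipeline influences the student only through the distillation term during training, so test-time prediction reduces to one embedding dot product per user-item pair and the teacher can be discarded.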

Keywords:
Computer science, Leverage (statistics), Distillation, Artificial intelligence, Machine learning, Adversarial system, Deep learning, Artificial neural network, Phrase

Metrics

Cited By: 57
FWCI (Field Weighted Citation Impact): 10.27 (see note below)
Refs: 56
Citation Normalized Percentile: 0.98 (in top 1%)
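
For context, FWCI is commonly defined (e.g., in Scopus/SciVal) as the ratio of the citations an article has actually received to the citations expected for publications of the same field, document type, and publication year:

\mathrm{FWCI} = \frac{\text{citations received}}{\text{expected citations for similar publications (same field, type, and year)}}

A value of 1.0 means the article is cited exactly as often as expected for its field; a value of 10.27 indicates roughly ten times the field average.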

Topics

Recommender Systems and Techniques (Physical Sciences → Computer Science → Information Systems)
Advanced Bandit Algorithms Research (Social Sciences → Decision Sciences → Management Science and Operations Research)
Advanced Graph Neural Networks (Physical Sciences → Computer Science → Artificial Intelligence)

Related Documents

BOOK-CHAPTER

Adversarial Variational Knowledge Distillation

Xuan Tang, Tong Lin

Series: Lecture Notes in Computer Science Year: 2021 Pages: 558-569
JOURNAL ARTICLE

Simple, Fast and Scalable Recommendation Systems via External Knowledge Distillation

D. V. Androsov, Nadezhda I. Nedashkovskaya

Journal: Radio Electronics, Computer Science, Control Year: 2025 Pages: 126-137
JOURNAL ARTICLE

Knowledge distillation meets recommendation: collaborative distillation for top-N recommendation

Jae-woong Lee, Minjin Choi, Lee Sael, Hyunjung Shim, Jongwuk Lee

Journal: Knowledge and Information Systems Year: 2022 Vol: 64 (5) Pages: 1323-1348