JOURNAL ARTICLE

Balanced knowledge distillation for one-stage object detector

Sungwook Lee, Seunghyun Lee, Byung Cheol Song

Year: 2022 Journal: Neurocomputing Vol: 500 Pages: 394-404 Publisher: Elsevier BV

Abstract

The latest knowledge distillation (KD) methods have successfully used the intermediate layers of a teacher model to supervise a student model toward better representations. However, previous KD methods could not obtain knowledge generalized across object scales from a one-stage object detector, because such a detector has a structural property of extracting objects of various scales from several intermediate layers. In other words, previous KD methods could not distill and transfer knowledge to the intermediate layers of a one-stage detector in a balanced way. We therefore propose a shared knowledge encoder and an averaged prototype transfer to remove or mitigate the distillation and transfer imbalances that degrade the KD process. Experimental results show that the proposed KD method outperforms state-of-the-art methods. For instance, it improves accuracy over the baseline by about 1.3% on PASCAL VOC and about 2.2% on MS COCO.
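The balanced multi-level transfer the abstract describes can be sketched generically. The following is a minimal illustration, not the paper's actual method: a feature-matching distillation loss that averages per-level losses so that every pyramid scale contributes equally, rather than letting large feature maps dominate. The function name and the MSE matching criterion are illustrative assumptions.

```python
import numpy as np

def balanced_distill_loss(teacher_feats, student_feats):
    """Average per-level feature-matching losses so every pyramid
    level contributes equally, regardless of its spatial size.

    teacher_feats, student_feats: lists of same-shaped arrays,
    one per intermediate layer (pyramid level).
    """
    assert len(teacher_feats) == len(student_feats)
    per_level = []
    for t, s in zip(teacher_feats, student_feats):
        # MSE normalizes by element count, so a 32x32 map and a
        # 16x16 map each yield one comparable scalar.
        per_level.append(np.mean((t - s) ** 2))
    # Uniform weight per level (not per pixel) is what keeps the
    # supervision balanced across object scales.
    return float(np.mean(per_level))
```

With identical teacher and student features the loss is zero; shifting every student feature by a constant c gives a loss of c² per level, independent of level size, which is the balancing property sketched here.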

Keywords:
Knowledge distillation, Object detection, One-stage detector, Encoder, Representation learning, Computer science, Artificial intelligence, Pattern recognition

Metrics

Cited By: 4
FWCI (Field Weighted Citation Impact): 0.50
References: 40
Citation Normalized Percentile: 0.58

Topics

Advanced Neural Network Applications (Physical Sciences → Computer Science → Computer Vision and Pattern Recognition)
Domain Adaptation and Few-Shot Learning (Physical Sciences → Computer Science → Artificial Intelligence)
Machine Learning and Data Classification (Physical Sciences → Computer Science → Artificial Intelligence)

Related Documents

JOURNAL ARTICLE

GAN-Knowledge Distillation for One-Stage Object Detection

Wanwei Wang, Wei Hong, Feng Wang, Jinke Yu

Journal: IEEE Access Year: 2020 Vol: 8 Pages: 60719-60727
JOURNAL ARTICLE

One-stage object detection knowledge distillation via adversarial learning

Na Dong, Yongqiang Zhang, Mingli Ding, Shibiao Xu, Yancheng Bai

Journal: Applied Intelligence Year: 2021 Vol: 52 (4) Pages: 4582-4598
JOURNAL ARTICLE

Integrated Knowledge Distillation for Efficient One-stage Object Detection Network

Zixu Cheng

Journal: Applied and Computational Engineering Year: 2023 Vol: 2 (1) Pages: 819-825
BOOK-CHAPTER

WOODWIND: Few-Shot Object Detector with Knowledge Distillation

Mengyuan Ma

Lecture Notes in Computer Science Year: 2025 Pages: 78-91
BOOK-CHAPTER

An Autonomous Driving Object Detector Based on Knowledge Distillation

Yang Li, Jiaxin Li, He Liu, Huakun Zhang

Lecture Notes in Electrical Engineering Year: 2025 Pages: 209-219