JOURNAL ARTICLE

Attention-Guided Knowledge Distillation for Efficient Single-Stage Detector

Abstract

Knowledge distillation has been successfully applied in image classification for model acceleration. Some works have also applied this technique to object detection, but they treat all feature regions equally when performing feature mimicking. In this paper, we propose an end-to-end attention-guided knowledge distillation method to train efficient single-stage detectors with much smaller backbones. More specifically, we introduce an attention mechanism that prioritizes the transfer of important knowledge by focusing on a sparse set of hard samples, leading to a more thorough distillation process. In addition, the proposed distillation method provides an easy way to train efficient detectors without the tedious ImageNet pre-training procedure. Extensive experiments on the PASCAL VOC and CityPersons datasets demonstrate the effectiveness of the proposed approach. We achieve 57.96% and 69.48% mAP on VOC07 with 1/8 VGG16 and 1/4 VGG16 backbones, greatly outperforming their ImageNet pre-trained counterparts by 11.7% and 7.1%, respectively.
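The core idea in the abstract — weighting the feature-mimic loss by an attention mask so the student concentrates on a sparse set of important regions rather than treating all locations equally — can be illustrated with a minimal NumPy sketch. This is not the paper's implementation; the saliency heuristic (channel-wise mean of absolute teacher activations) and the `keep_ratio` parameter are assumptions made for illustration.

```python
import numpy as np

def attention_mask(teacher_feat, keep_ratio=0.25):
    """Build a binary spatial mask from a teacher feature map (C, H, W).

    The channel-wise mean of absolute activations serves as a saliency
    score; only the top `keep_ratio` fraction of locations is kept, so
    the student mimics a sparse set of informative regions.
    """
    saliency = np.abs(teacher_feat).mean(axis=0)          # (H, W)
    thresh = np.quantile(saliency, 1.0 - keep_ratio)
    return (saliency >= thresh).astype(np.float32)

def masked_mimic_loss(student_feat, teacher_feat, mask):
    """Attention-weighted L2 feature-mimic loss, normalized by mask area."""
    diff = (student_feat - teacher_feat) ** 2             # (C, H, W)
    weighted = diff * mask[None, :, :]                    # zero out unattended positions
    return weighted.sum() / (mask.sum() * student_feat.shape[0] + 1e-8)

rng = np.random.default_rng(0)
teacher = rng.normal(size=(8, 4, 4)).astype(np.float32)
student = rng.normal(size=(8, 4, 4)).astype(np.float32)
mask = attention_mask(teacher, keep_ratio=0.25)
loss = masked_mimic_loss(student, teacher, mask)
```

In a full training loop this loss term would be added to the detector's usual classification and localization losses; an unmasked version reduces to the plain feature-mimic loss that the paper argues is suboptimal.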

Keywords:
Knowledge distillation; Object detection; Single-stage detector; Attention mechanism; Feature mimicking; Model acceleration

Metrics

Cited By: 4
FWCI (Field Weighted Citation Impact): 0.20
Refs: 22
Citation Normalized Percentile: 0.48


Topics

Advanced Neural Network Applications
Physical Sciences →  Computer Science →  Computer Vision and Pattern Recognition
Advanced Image and Video Retrieval Techniques
Physical Sciences →  Computer Science →  Computer Vision and Pattern Recognition
Domain Adaptation and Few-Shot Learning
Physical Sciences →  Computer Science →  Artificial Intelligence

Related Documents

JOURNAL ARTICLE

AGS-SSD: Attention-Guided Sampling for 3D Single-Stage Detector

Hanxiang Qian, Peng Wu, Bei Sun, Shaojing Su

Journal: Electronics Year: 2022 Vol: 11 (14) Pages: 2268-2268
JOURNAL ARTICLE

Efficient Visual Sentiment Detector using Knowledge Distillation

Hyun Wook Kang, Kwang‐Il Kim

Journal: The Journal of Korean Institute of Information Technology Year: 2021 Vol: 19 (11) Pages: 37-43
JOURNAL ARTICLE

Efficient Facial Landmark Detector by Knowledge Distillation

Yuyang Sha

Conference: 2021 16th IEEE International Conference on Automatic Face and Gesture Recognition (FG 2021) Year: 2021 Pages: 1-8
JOURNAL ARTICLE

Balanced knowledge distillation for one-stage object detector

Sungwook Lee, Seunghyun Lee, Byung Cheol Song

Journal: Neurocomputing Year: 2022 Vol: 500 Pages: 394-404