JOURNAL ARTICLE

Knowledge Distillation for Human Pose Estimation Using Channel Dropout Strategy

Abstract

Large models have recently attracted growing attention and adoption. From large-scale convolutional neural network (CNN) models to vision transformers (ViTs), which contain far more parameters, big models are being applied to all kinds of tasks. The demand for accuracy is a major reason for using large models, and the rapid development of computing hardware, which enables training of large-scale models, in turn stimulates further demand for model accuracy. For mobile or edge computing, however, there is an inevitable trade-off between accuracy and real-time inference due to limited computing power on the terminal end. To make this trade-off more favourable, we propose Knowledge Distillation with a Channel Dropout Strategy (CDKD), which applies both intermediate distillation and final distillation combined with channel dropout, improving the accuracy of a small model while keeping the same number of parameters and FLOPs. In experiments on the COCO2017 validation set, under the same setup, a ResNet-18 model trained with our method outperformed the baseline model.
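The abstract names two distillation points (intermediate features and final output) combined with channel dropout. A minimal numpy sketch of that idea is below; the function names, the masking scheme, the MSE objective, and the loss weights are our illustrative assumptions, not details taken from the paper:

```python
import numpy as np

def channel_dropout_mask(num_channels, drop_prob, rng):
    # Keep each channel with probability (1 - drop_prob);
    # guarantee at least one channel survives.
    keep = rng.random(num_channels) >= drop_prob
    if not keep.any():
        keep[rng.integers(num_channels)] = True
    return keep.astype(np.float32)

def distill_loss(student_feat, teacher_feat, drop_prob=0.5, seed=0):
    """MSE between student and teacher feature maps of shape (C, H, W),
    computed only over a random subset of channels (channel dropout)."""
    rng = np.random.default_rng(seed)
    c, h, w = student_feat.shape
    mask = channel_dropout_mask(c, drop_prob, rng)          # shape (C,)
    diff = (student_feat - teacher_feat) * mask[:, None, None]
    # Normalize by the number of kept elements so the loss scale
    # does not depend on how many channels were dropped.
    kept = mask.sum() * h * w
    return float((diff ** 2).sum() / kept)

def total_loss(inter_s, inter_t, final_s, final_t, alpha=0.5):
    # Weighted sum of intermediate-feature distillation and
    # final-output distillation (alpha is illustrative).
    return alpha * distill_loss(inter_s, inter_t) + \
           (1.0 - alpha) * distill_loss(final_s, final_t)
```

In a real training loop the teacher features would come from a frozen large model and the masked MSE term would be added to the task loss; here the point is only that dropping channels makes the student match a random subset of teacher channels at each step rather than all of them.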

Keywords:
Computer science; Inference; Convolutional neural network; FLOPs; Dropout (neural networks); Machine learning; Artificial intelligence; Transformer; Edge computing; Deep learning

Topics (Physical Sciences → Computer Science → Computer Vision and Pattern Recognition)

Human Pose and Action Recognition
Advanced Neural Network Applications
Video Surveillance and Tracking Methods