JOURNAL ARTICLE

Efficient Human Pose Estimation via Multi-Head Knowledge Distillation

Abstract

Neural networks have grown increasingly large in pursuit of higher accuracy and better adaptability. For human pose estimation, heavy networks are commonly used to reach higher performance, but larger networks reduce inference speed and consume more computing resources. Achieving strong accuracy with a smaller model has therefore become a valuable research subject. In this paper, we apply knowledge distillation to improve the performance of a lightweight model. However, a performance gap between teacher and student networks remains. To further narrow this gap, we propose a multi-head architecture that improves the accuracy of smaller models, raising their performance to a level comparable to larger models.
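The abstract does not spell out the distillation objective, but a common formulation for heatmap-based pose estimation combines a mean-squared error to the ground-truth heatmaps with a mean-squared error to the teacher's predicted heatmaps. The sketch below is a minimal NumPy illustration of that idea; the function name, the `alpha` weighting, and the heatmap shapes are assumptions for illustration, not the paper's exact method.

```python
import numpy as np

def distillation_loss(student_heatmaps, teacher_heatmaps, gt_heatmaps, alpha=0.5):
    """Hypothetical pose-distillation loss: a weighted sum of the MSE to the
    ground-truth heatmaps and the MSE to the teacher's heatmaps.
    alpha balances supervision from labels vs. the teacher (assumed value)."""
    mse_gt = np.mean((student_heatmaps - gt_heatmaps) ** 2)
    mse_teacher = np.mean((student_heatmaps - teacher_heatmaps) ** 2)
    return alpha * mse_gt + (1.0 - alpha) * mse_teacher

# Toy example: 17 keypoints with 64x48 heatmaps (a common COCO-style setup).
rng = np.random.default_rng(0)
gt = rng.random((17, 64, 48))
teacher = gt + 0.01 * rng.standard_normal((17, 64, 48))  # strong teacher, near GT
student = gt + 0.05 * rng.standard_normal((17, 64, 48))  # noisier student
loss = distillation_loss(student, teacher, gt)
```

In a multi-head setting, one such loss term would typically be computed per auxiliary head and summed, so every head receives supervision from both the labels and the teacher.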

Keywords:
Adaptability, Computer science, Inference, Artificial neural network, Artificial intelligence, Machine learning, Distillation, Estimation, Engineering

Metrics

Cited By: 0
FWCI (Field Weighted Citation Impact): 0.00
Refs: 19
Citation Normalized Percentile: 0.17

Topics

Human Pose and Action Recognition
Physical Sciences →  Computer Science →  Computer Vision and Pattern Recognition
Hand Gesture Recognition Systems
Physical Sciences →  Computer Science →  Human-Computer Interaction
Video Surveillance and Tracking Methods
Physical Sciences →  Computer Science →  Computer Vision and Pattern Recognition

Related Documents

CONFERENCE PAPER

Pyramid Knowledge Distillation for Efficient Human Pose Estimation

Yang Li, Peng Jiao, Haoqian Wang

Published in:   2022 IEEE International Conference on Image Processing (ICIP) Year: 2022 Pages: 2177-2181
CONFERENCE PAPER

Online Knowledge Distillation for Efficient Pose Estimation

Zheng Li, Jingwen Ye, Mingli Song, Ying Huang, Zhigeng Pan

Published in:   2021 IEEE/CVF International Conference on Computer Vision (ICCV) Year: 2021 Pages: 11720-11730