Pourya Shamsolmoali, Jocelyn Chanussot, Huiyu Zhou, Yue Lu
Efficient object detection methods have recently received great attention in remote sensing. Although deep convolutional networks often achieve excellent detection accuracy, deploying them on resource-limited edge devices is difficult. Knowledge distillation (KD) addresses this issue by making models lightweight while maintaining accuracy. However, existing KD methods for object detection suffer from two constraints. First, they discard potentially important background information and distill only nearby foreground regions. Second, they rely solely on the global context, which limits the student detector's ability to acquire local information from the teacher detector. To address these challenges, we propose Attention-based Feature Distillation (AFD), a new KD approach that distills both local and global information from the teacher detector. To enhance local distillation, we introduce a multi-instance attention mechanism that effectively distinguishes background from foreground elements, prompting the student detector to focus on the channels and pixels identified as important by the teacher detector. Because local distillation alone lacks global context, we further propose attention-based global distillation, which reconstructs the relations between pixels and transfers them from the teacher to the student detector. We evaluate AFD on two public aerial image benchmarks; the results show that AFD matches the performance of state-of-the-art detectors while remaining efficient.
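The abstract describes two losses: a local term that weights feature imitation by teacher-derived spatial and channel attention (with foreground and background treated separately), and a global term that transfers pairwise pixel relations. The sketch below is a minimal NumPy illustration of that general idea, not the authors' implementation; the function names, the temperature `T`, and the foreground/background weights `alpha`/`beta` are assumptions for illustration.

```python
import numpy as np

def attention_maps(feat, T=0.5):
    # feat: teacher feature map of shape (C, H, W).
    # Spatial attention: mean |activation| over channels, softmax over pixels.
    # Channel attention: mean |activation| over pixels, softmax over channels.
    # (Temperature T is an assumed softening hyperparameter.)
    C, H, W = feat.shape
    spatial = np.abs(feat).mean(axis=0).reshape(-1)
    spatial = np.exp(spatial / T) / np.exp(spatial / T).sum()
    channel = np.abs(feat).mean(axis=(1, 2))
    channel = np.exp(channel / T) / np.exp(channel / T).sum()
    # Rescale so the attention weights average to 1.
    return spatial.reshape(H, W) * H * W, channel * C

def local_distill_loss(student, teacher, fg_mask, alpha=1.0, beta=0.5):
    # Attention-weighted squared error between student and teacher features,
    # with separate (assumed) weights for foreground and background pixels.
    # fg_mask: (H, W) binary mask marking foreground regions.
    s_att, c_att = attention_maps(teacher)
    w = s_att[None] * c_att[:, None, None]          # (C, H, W) joint weight
    se = (student - teacher) ** 2 * w
    return alpha * (se * fg_mask).sum() + beta * (se * (1 - fg_mask)).sum()

def global_distill_loss(student, teacher):
    # Transfer pairwise pixel relations: compare Gram matrices of the
    # flattened feature maps, one simple way to encode global context.
    C = student.shape[0]
    s = student.reshape(C, -1)
    t = teacher.reshape(C, -1)
    return (((s.T @ s - t.T @ t) / C) ** 2).mean()
```

In this sketch, a matched student layer would be trained with the detection loss plus a weighted sum of the two distillation terms; identical student and teacher features drive both terms to zero.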