Knowledge distillation has proven to be an important component of unsupervised image anomaly detection and localization. A significant challenge, however, is how the student network can extract more accurate feature information from the teacher network during distillation. In this paper, we propose a refined knowledge distillation network that introduces simulated anomalous images to address the lack of anomalous samples in unsupervised anomaly detection. First, to enhance the student network's ability to focus on the important features of the teacher network while ensuring spatial consistency between student and teacher features, we design a double feature enhancement and alignment module (DFEAM). Second, we propose a multi-domain enhanced restoration loss that reduces the loss of critical details while the student network learns from the teacher network, thereby improving its feature learning capability. Finally, to account for the differing importance of features at different layers for various detection tasks, we design a multi-scale feature interaction fusion module (MFIFM) that effectively integrates feature information across layers. Extensive experiments on a publicly available industrial anomaly detection dataset validate the effectiveness of our model and its modules: image-level anomaly detection achieves an area under the curve (AUC) of 98.9%, pixel-level anomaly detection achieves an average precision (AP) of 78.3%, and instance-level anomaly detection achieves an instance average precision (IAP) of 77.5%.
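The core idea above (scoring anomalies by teacher-student feature discrepancy and fusing maps from multiple layers) can be sketched minimally. This is not the paper's DFEAM or MFIFM; it is a generic multi-scale student-teacher scoring sketch, assuming per-layer cosine-distance maps and fixed (rather than learned) fusion weights, with hypothetical names throughout:

```python
import numpy as np

def anomaly_map_from_features(teacher_feats, student_feats, weights):
    """Fuse per-layer teacher-student discrepancies into one anomaly map.

    teacher_feats / student_feats: lists of arrays shaped (C, H, W),
    one pair per network layer (assumed square spatial grids).
    weights: per-layer fusion scalars (hypothetical; the paper's MFIFM
    learns an interaction-based fusion instead).
    """
    target_hw = max(f.shape[1] for f in teacher_feats)
    fused = np.zeros((target_hw, target_hw))
    for t, s, w in zip(teacher_feats, student_feats, weights):
        # 1 - cosine similarity along the channel axis: large where the
        # student failed to reproduce the teacher's features, i.e. at
        # regions the student never saw during normal-only training.
        num = (t * s).sum(axis=0)
        den = np.linalg.norm(t, axis=0) * np.linalg.norm(s, axis=0) + 1e-8
        dmap = 1.0 - num / den
        # Nearest-neighbour upsample to the largest layer resolution.
        scale = target_hw // dmap.shape[0]
        dmap = np.kron(dmap, np.ones((scale, scale)))
        fused += w * dmap
    return fused  # pixel-level map; an image-level score is fused.max()
```

With identical teacher and student features the map is near zero everywhere; corrupting the student's features in one region raises the fused score only there, which is what makes the image-level maximum a usable detection score.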
Muhao Xu, Cuiping Zhu, Guang Feng, Sijie Niu