To address the low recognition accuracy and the high rates of missed and false detections in helmet-wearing detection on construction sites, this study proposes an improved YOLOv7-tiny object detection algorithm. The algorithm uses the Mosaic method to randomly splice input images and then augments the data efficiently by applying random rotation, scaling, and cropping to the spliced images. To improve training stability, the coordinate (bounding-box regression) loss in the YOLOv7-tiny network is replaced with the WIoU (Wise-IoU) loss. To strengthen feature-layer fusion and mitigate the vanishing-gradient problem in the deep network, a RepBlock layer is introduced into the original network. Experiments on a publicly available helmet detection dataset show that, compared with the original network, the improved algorithm not only yields more stable convergence of the loss function but also raises detection accuracy to 94.2%, an improvement of 18.4 percentage points; the mAP also improves significantly, to 92.8%. Missed and false detections are likewise effectively reduced, demonstrating the clear advantage of the improved algorithm over the original network in detection performance.
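The abstract names WIoU as the replacement for the coordinate loss but does not specify the variant. As an illustration only, the sketch below implements the WIoU v1 form, in which the plain IoU loss is scaled by a distance-based attention factor computed from the smallest enclosing box; the function names and box format (corner coordinates) are assumptions, not the paper's code.

```python
import math

def iou(b1, b2):
    """Plain IoU of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(b1[0], b2[0]), max(b1[1], b2[1])
    ix2, iy2 = min(b1[2], b2[2]), min(b1[3], b2[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    a1 = (b1[2] - b1[0]) * (b1[3] - b1[1])
    a2 = (b2[2] - b2[0]) * (b2[3] - b2[1])
    return inter / (a1 + a2 - inter + 1e-9)

def wiou_v1_loss(pred, target):
    """Illustrative WIoU v1: L_WIoU = R_WIoU * L_IoU, where
    R_WIoU = exp(((xc - xc_gt)^2 + (yc - yc_gt)^2) / (Wg^2 + Hg^2))
    and Wg, Hg are the width/height of the smallest box enclosing
    both pred and target (treated as a constant w.r.t. the gradient
    in the original formulation; irrelevant in this pure-Python sketch).
    """
    l_iou = 1.0 - iou(pred, target)
    # Center points of the predicted and ground-truth boxes.
    px, py = (pred[0] + pred[2]) / 2, (pred[1] + pred[3]) / 2
    tx, ty = (target[0] + target[2]) / 2, (target[1] + target[3]) / 2
    # Smallest enclosing box dimensions.
    wg = max(pred[2], target[2]) - min(pred[0], target[0])
    hg = max(pred[3], target[3]) - min(pred[1], target[1])
    r = math.exp(((px - tx) ** 2 + (py - ty) ** 2) / (wg ** 2 + hg ** 2 + 1e-9))
    return r * l_iou
```

A perfectly matched box gives zero loss, while the exponential factor penalizes center misalignment more heavily than the plain IoU loss alone, which is the stabilizing effect the abstract attributes to WIoU.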