JOURNAL ARTICLE

Knowledge distillation and student–teacher learning for weed detection in turf

Danlan Zhai, Teng Liu, Feiyu He, Jinxu Wang, Xiaojun Jin, Jialin Yu

Year: 2024   Journal: Weed Science   Vol: 72 (6)   Pages: 804-815   Publisher: Cambridge University Press

Abstract

Machine vision–based herbicide applications relying on object detection or image classification deep convolutional neural networks (DCNNs) demand high memory and computational resources, resulting in lengthy inference times. To tackle these challenges, this study assessed the effectiveness of three teacher models, each trained on datasets of varying sizes, including D-20k (comprising 10,000 true-positive and true-negative images) and D-10k (comprising 5,000 true-positive and true-negative images). Additionally, knowledge distillation was performed on their corresponding student models across a range of temperature settings. After the process of student–teacher learning, the parameters of all student models were reduced. ResNet18 not only achieved higher accuracy (ACC ≥ 0.989) but also maintained higher frames per second (FPS ≥ 742.9) under its optimal temperature condition (T = 1). Overall, the results suggest that employing knowledge distillation in the machine vision models enabled accurate and reliable weed detection in turf while reducing the need for extensive computational resources, thereby facilitating real-time weed detection and contributing to the development of smart, machine vision–based sprayers.
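The temperature parameter T mentioned in the abstract comes from the standard (Hinton-style) formulation of knowledge distillation, where teacher and student logits are softened by a temperature-scaled softmax and the student is trained to match the teacher's soft targets. The abstract does not give the authors' exact loss; the sketch below is a minimal, generic illustration of that mechanism, with made-up logit values for demonstration.

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: larger T yields a softer
    # (more uniform) probability distribution over classes.
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, T=1.0):
    # KL divergence between the teacher's and student's softened
    # distributions, scaled by T^2 so gradient magnitudes stay
    # comparable across temperature settings.
    p = softmax(teacher_logits, T)   # teacher soft targets
    q = softmax(student_logits, T)   # student soft predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return T * T * kl

# Hypothetical 3-class logits (e.g., weed species vs. turf):
teacher = [4.0, 1.0, 0.2]
student = [3.5, 1.2, 0.3]
print(distillation_loss(student, teacher, T=1.0))
```

In practice this distillation term is combined with the ordinary cross-entropy loss on the hard labels; at T = 1 (the optimal setting reported for ResNet18) the soft targets reduce to the teacher's plain softmax outputs.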

Keywords:
Weed, Distillation, Environmental science, Mathematics education, Psychology, Chemistry, Biology, Agronomy, Chromatography

Metrics

Cited By: 0
FWCI (Field Weighted Citation Impact): 0.00
Refs: 0
Citation Normalized Percentile: 0.35

Topics

Diverse Educational Innovations Studies
Life Sciences →  Agricultural and Biological Sciences →  General Agricultural and Biological Sciences
Animal and Plant Science Education
Social Sciences →  Psychology →  Social Psychology