TY - JOUR
T1 - YOLOv7-CBAM and DeepSORT with pixel grid analysis for real-time weed localization and intra-row density estimation in apple orchards
AU - Arthur, Lawrence
AU - Mahnan, Sadjad
AU - He, Long
AU - Hussain, Magni
AU - Heinemann, Paul
AU - Brunharo, Caio
N1 - Publisher Copyright:
© 2025 Elsevier B.V.
PY - 2025/12
Y1 - 2025/12
AB - In precision weed management, accurate detection, localization, and density estimation of weeds are crucial for effective decision-making. However, in complex settings such as apple orchards, crop canopies and low-hanging branches obstruct traditional top-view camera systems, requiring a side-view camera configuration that often leads to partial weed visibility and overlapping occlusions, and in turn to misclassification or tracking loss. To address these challenges, this study enhances the YOLOv7 segmentation model with a Convolutional Block Attention Module (CBAM) for improved feature extraction and real-time detection of weed species, and integrates the DeepSORT algorithm, leveraging its robust tracking capabilities together with a dynamic Kalman-filter cross-line mechanism to minimize detection loss from occlusions across frames. The enhanced model achieved a mean Average Precision (mAP) of 84.9% for segmentation and 83.6% for localization, while tracking performance showed a Multiple Object Tracking Accuracy (MOTA) of 0.82, a Multiple Object Tracking Precision (MOTP) of 0.78, and an Identification F1-score (IDF1) of 0.88, with only six identity switches. Additionally, a novel pixel grid-based method estimates weed density at 75%, 50%, and 25% mask coverage thresholds, providing a detailed and actionable baseline assessment of weed severity. The model's effective quantification, detection, and tracking capabilities indicate that precision weed management decisions in apple orchards can be substantially improved.
UR - https://www.scopus.com/pages/publications/105018173277
UR - https://www.scopus.com/inward/citedby.url?scp=105018173277&partnerID=8YFLogxK
U2 - 10.1016/j.compag.2025.111071
DO - 10.1016/j.compag.2025.111071
M3 - Article
AN - SCOPUS:105018173277
SN - 0168-1699
VL - 239
JO - Computers and Electronics in Agriculture
JF - Computers and Electronics in Agriculture
M1 - 111071
ER -