Automatic weed quantification in potato crops based on a modified convolutional neural network using drone images

Authors

Vinueza K, Sandoval-Pillajo L, Giret-Boggino A, Trejo-España D, Pusdá-Chulde M, García-Santillán I

DOI:

https://doi.org/10.56294/dm2025194

Keywords:

Weed quantification, Deep learning, UAV images, Semantic segmentation, CNN, ResNeXt50

Abstract

Identifying and quantifying weeds is crucial in agriculture for controlling them efficiently. Weeds compete with the crop for nutrients, minerals, physical space, sunlight, and water, causing problems that range from low production to economic losses and environmental deterioration of the land. Weed quantification is generally a manual process that demands significant time and precision, and Convolutional Neural Networks (CNNs) are widely used to automate it. The purpose of this research is therefore to adapt the ResNeXt50 CNN architecture for semantic segmentation, focused on the automatic quantification of weeds (broadleaf dock, dandelion, Kikuyu grass, and other unidentified classes) in potato fields using RGB images acquired with a DJI Mavic 2 Pro drone. The analytical model was trained following the Knowledge Discovery in Databases (KDD) methodology using Python and the TensorFlow-Keras framework. The results indicate that the modified ResNeXt50 model achieved a mean IoU of 0.7350, a performance comparable to values reported by other authors who considered fewer weed classes. The Student's t-test and the Pearson correlation coefficient were applied to contrast the weed coverage derived from the model predictions against the ground truth, indicating no statistically significant differences between the two measurements for most weed classes.
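The two evaluation ideas in the abstract can be sketched in a few lines of Python. Below is a minimal, illustrative implementation (not the authors' code) of the mean IoU metric over label masks, plus a paired Student's t-test and Pearson correlation comparing predicted versus ground-truth weed coverage with SciPy; the toy masks and coverage values are invented for the example.

```python
import numpy as np
from scipy import stats

def mean_iou(pred, truth, num_classes):
    """Per-class intersection-over-union, averaged over the classes
    that appear in either mask. `pred` and `truth` are integer label
    masks of identical shape with values in [0, num_classes)."""
    ious = []
    for c in range(num_classes):
        p, t = (pred == c), (truth == c)
        union = np.logical_or(p, t).sum()
        if union == 0:          # class absent from both masks: skip it
            continue
        ious.append(np.logical_and(p, t).sum() / union)
    return float(np.mean(ious))

def coverage(mask, c):
    """Fraction of pixels in `mask` assigned to class `c`."""
    return float((mask == c).mean())

# Toy 2-class masks (0 = soil/crop, 1 = weed)
truth = np.array([[0, 1, 1],
                  [0, 0, 1]])
pred  = np.array([[0, 1, 0],
                  [0, 0, 1]])
print(round(mean_iou(pred, truth, 2), 4))  # → 0.7083

# Contrasting predicted vs. ground-truth coverage across several images
pred_cov  = [0.12, 0.30, 0.25, 0.18, 0.22]   # hypothetical values
truth_cov = [0.11, 0.33, 0.24, 0.20, 0.21]
t_stat, p_value = stats.ttest_rel(pred_cov, truth_cov)  # paired t-test
r, _ = stats.pearsonr(pred_cov, truth_cov)              # correlation
print(p_value > 0.05)  # no significant difference at the 5% level
```

A non-significant p-value together with a high correlation is what the abstract reports for most weed classes: the model's coverage estimates track the manual ground truth without a systematic offset.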



Published

2025-02-13

Section

Original

How to Cite

Vinueza K, Sandoval-Pillajo L, Giret-Boggino A, Trejo-España D, Pusdá-Chulde M, García-Santillán I. Automatic weed quantification in potato crops based on a modified convolutional neural network using drone images. Data and Metadata [Internet]. 2025 Feb. 13 [cited 2025 Apr. 27];4:194. Available from: https://dm.ageditor.ar/index.php/dm/article/view/194