
Abstract

Agricultural robots are advancing rapidly alongside the relevant enabling technologies and are in growing demand to help secure the food supply; as such, they are slated to play an important role in precision agriculture. In tomato production, harvesting occupies over 40% of the total workforce, so developing a robot harvester to assist workers is worthwhile. The objective of this work is to understand the factors that limit recognition accuracy when using image processing and deep learning methods, and to improve crop detection in complex agricultural environments. With accurate recognition of the growing status and location of crops, timely crop management and selective harvesting become feasible, and the problems caused by the growing shortage of agricultural labour can be alleviated. To this end, this work integrates classic image processing methods with the YOLOv5 (You Only Look Once, version 5) network to increase the accuracy and robustness of tomato and stem perception. Accordingly, an algorithm to estimate the maturity of truss tomatoes (clusters of individual tomatoes) and an integrated stem-localization method, weighted by the experimental error of each individual method, are proposed. Both indoor and real-field tests were carried out using a robot harvester. The results demonstrate the high accuracy of the proposed algorithms under varied illumination conditions, with an average deviation of 2 mm from the ground truth. The robot can be guided to harvest truss tomatoes efficiently, with an average operating time of 9 s per cluster.
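Note: this record does not reproduce the authors' implementation. As a rough, hypothetical sketch of the kind of pipeline the abstract describes (YOLOv5 detection combined with classic image processing, plus error-weighted fusion of stem-position estimates), consider the Python fragment below. The model weights, HSV thresholds, ripeness cutoff, and inverse-variance fusion rule are all illustrative assumptions, not the paper's method.

# Hypothetical sketch only -- not the authors' code. Model weights,
# HSV thresholds, and the fusion rule are illustrative assumptions.
import cv2
import numpy as np
import torch

# Pretrained YOLOv5 (COCO weights); a real harvester would use weights
# fine-tuned on tomato and stem classes.
model = torch.hub.load("ultralytics/yolov5", "yolov5s", pretrained=True)

def red_ratio(bgr_crop):
    """Fraction of pixels in the red hue bands of HSV space (maturity proxy)."""
    hsv = cv2.cvtColor(bgr_crop, cv2.COLOR_BGR2HSV)
    # Red wraps around the hue axis, so combine the two end ranges.
    low = cv2.inRange(hsv, (0, 80, 60), (10, 255, 255))
    high = cv2.inRange(hsv, (170, 80, 60), (180, 255, 255))
    mask = cv2.bitwise_or(low, high)
    return cv2.countNonZero(mask) / mask.size

def detect_and_score(bgr_image, ripe_threshold=0.6):
    """Detect tomatoes with YOLOv5, then score each detection's maturity."""
    rgb = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2RGB)  # YOLOv5 expects RGB
    results = model(rgb)
    scored = []
    for *box, conf, cls in results.xyxy[0].tolist():
        x1, y1, x2, y2 = (int(v) for v in box)
        crop = bgr_image[y1:y2, x1:x2]
        if crop.size == 0:
            continue
        ratio = red_ratio(crop)
        scored.append({"box": (x1, y1, x2, y2),
                       "det_conf": conf,
                       "maturity": ratio,
                       "ripe": ratio >= ripe_threshold})
    return scored

def fuse_stem_estimates(positions, errors):
    """Inverse-variance weighted fusion of per-method stem positions.

    A standard way to combine estimates "based on the error of each
    individual method"; the paper's actual weighting is not given here.
    """
    w = 1.0 / np.square(np.asarray(errors, dtype=float))
    p = np.asarray(positions, dtype=float)
    return (w[:, None] * p).sum(axis=0) / w.sum()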

Details

Title
Efficient tomato harvesting robot based on image processing and deep learning
Author
Miao, Zhonghua 1; Yu, Xiaoyou 1; Li, Nan 1; Zhang, Zhe 1; He, Chuangxin 1; Li, Zhao 1; Deng, Chunyu 1; Sun, Teng 1

1 Shanghai University, Intelligent Equipment and Robotics Lab, Department of Automation, School of Mechatronic Engineering and Automation, Shanghai, China (GRID:grid.39436.3b) (ISNI:0000 0001 2323 5732)
Pages
254-287
Publication year
2023
Publication date
Feb 2023
Publisher
Springer Nature B.V.
ISSN
1385-2256
e-ISSN
1573-1618
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
2766898246
Copyright
© The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature 2022. Springer Nature or its licensor holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.