Full Text

Abstract

Walnut shell–kernel separation is an essential step in the deep processing of walnuts, and incomplete separation is a key factor limiting the added value of walnut products and the development of the walnut industry. This study proposes a machine vision-based walnut shell–kernel detection method built on the YOLOX deep-learning network to address common problems of existing approaches, such as incomplete shell–kernel separation in current airflow screening and the high cost and low efficiency of manually assisted screening. Images of walnut shells and kernels were acquired after shell breaking, annotated with Labelme, and converted to the COCO dataset format. The network was then trained for 110 epochs. At an intersection over union (IoU) threshold of 0.5, the average precision (AP), average recall (AR), model size, and floating-point operations per second were 96.3%, 84.7%, 99 MB, and 351.9, respectively. Compared with the YOLOv3, Faster Region-based Convolutional Neural Network (Faster R-CNN), and Single Shot MultiBox Detector (SSD) algorithms, the AP of the proposed method was higher by 2.1%, 1.3%, and 3.4%, and the AR was higher by 10%, 2.3%, and 9%, respectively. Walnut shell–kernel detection was also evaluated under different conditions, including different walnut varieties, supplementary lighting, and partial shielding (occlusion); the model maintained high recognition and localization precision in all cases, demonstrating strong robustness. Moreover, its small size facilitates migration and deployment on other platforms. These results provide a technical reference for developing faster walnut shell–kernel separation methods.
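
To make the annotation pipeline described in the abstract concrete, the sketch below shows one plausible way to convert Labelme JSON annotations of shell and kernel targets into a single COCO-format detection file. This is not the authors' code; the directory name, output file name, and the two class labels ("shell", "kernel") are assumptions for illustration.

# Minimal sketch (assumed setup, not the authors' code): convert a folder of
# Labelme JSON files into one COCO-format detection annotation file.
import json
import glob
import os

# Hypothetical class labels; the paper detects walnut shells and kernels.
CATEGORIES = [{"id": 1, "name": "shell"}, {"id": 2, "name": "kernel"}]
NAME_TO_ID = {c["name"]: c["id"] for c in CATEGORIES}

def labelme_dir_to_coco(labelme_dir: str, out_file: str) -> None:
    images, annotations = [], []
    ann_id = 1
    json_paths = sorted(glob.glob(os.path.join(labelme_dir, "*.json")))
    for img_id, json_path in enumerate(json_paths, start=1):
        with open(json_path, "r", encoding="utf-8") as f:
            data = json.load(f)
        images.append({
            "id": img_id,
            "file_name": data["imagePath"],
            "height": data["imageHeight"],
            "width": data["imageWidth"],
        })
        for shape in data["shapes"]:
            if shape["label"] not in NAME_TO_ID:
                continue  # skip any labels other than shell/kernel
            xs = [p[0] for p in shape["points"]]
            ys = [p[1] for p in shape["points"]]
            # An axis-aligned bounding box covers both rectangle and polygon shapes.
            x, y = min(xs), min(ys)
            w, h = max(xs) - x, max(ys) - y
            annotations.append({
                "id": ann_id,
                "image_id": img_id,
                "category_id": NAME_TO_ID[shape["label"]],
                "bbox": [x, y, w, h],  # COCO uses [x, y, width, height]
                "area": w * h,
                "iscrowd": 0,
            })
            ann_id += 1
    with open(out_file, "w", encoding="utf-8") as f:
        json.dump({"images": images, "annotations": annotations,
                   "categories": CATEGORIES}, f)

if __name__ == "__main__":
    # Assumed paths for illustration only.
    labelme_dir_to_coco("walnut_labelme_annotations", "instances_train.json")

With the annotations in COCO format, AP and AR at an IoU threshold of 0.5 can then be computed with standard tooling such as pycocotools' COCOeval, which is the usual way detectors of this kind are evaluated.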

Details

Title
Machine Vision-Based Chinese Walnut Shell–Kernel Recognition and Separation
Author
Zhang, Yongcheng 1; Wang, Xingyu 2; Liu, Yang 2; Li, Zhanbiao 2; Lan, Haipeng 2; Zhang, Zhaoguo 3; Ma, Jiale 2

1 Modern Agricultural Engineering Key Laboratory at Universities of Education Department of Xinjiang Uygur Autonomous Region, Tarim University, Alaer 843300, China; College of Mechanical Electrification Engineering, Tarim University, Alaer 843300, China; College of Engineering, Huazhong Agricultural University, Wuhan 430070, China
2 Modern Agricultural Engineering Key Laboratory at Universities of Education Department of Xinjiang Uygur Autonomous Region, Tarim University, Alaer 843300, China; College of Mechanical Electrification Engineering, Tarim University, Alaer 843300, China
3 Modern Agricultural Engineering Key Laboratory at Universities of Education Department of Xinjiang Uygur Autonomous Region, Tarim University, Alaer 843300, China; Faculty of Modern Agricultural Engineering, Kunming University of Science and Technology, Kunming 650500, China
First page
10685
Publication year
2023
Publication date
2023
Publisher
MDPI AG
e-ISSN
2076-3417
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
2876479636
Copyright
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.