1. Introduction
As a domesticated poultry type with a long history, broilers (broiler chickens) are highly popular globally. Broilers provide economical and nutritious meat, with advantages over pork and beef such as higher protein content, lower fat levels, and lower calorie counts [1]. In recent years, the increasing demand for low-fat, high-protein chicken has driven the development of large-scale flat farming of broilers [2]. The flat breeding mode refers to raising a flock of broilers directly on the ground or on a mesh floor [3]. Broilers raised in this mode have full mobility, high bone strength, and good meat quality [4].
Identifying and clearing dead broilers manually is a time-consuming and laborious task. First, the large amount of auditory input and cognitive information processing involved competes with the visual search, reducing attention during this single repetitive task [5]. Second, the breeding environment contains large amounts of gases that are harmful to humans, such as NH3, H2S, and CO [6], as well as large amounts of dust [7], both of which are detrimental to the health of people who work in these environments. Finally, because inspections must be performed across different broiler coops [8], manual identification and removal of dead broilers increases the risks of cross-infection between coops and of disease transmission between poultry and mammals. Intelligent dead broiler removal devices can therefore be used to mitigate these drawbacks.
Early identification of dead broilers relied mainly on traditional methods based on features such as broiler vocalization, body temperature, and posture [9,10,11,12]. With the continuous advancement of technology, deep learning and vision technologies have been applied to identify the conditions of livestock and poultry [13]. Vision technology is highly efficient and can assess the condition of broilers without touching them [14], while deep learning learns from massive amounts of data [15], further improving the accuracy and efficiency of recognition.
A number of studies have proposed methods to recognize dead broilers based on different characteristics of chickens. Lu et al. monitored the colors of chicken crowns using machine vision technology to identify dead broilers [16]; however, this required three consecutive even shakes of the coop, which could cause stress behaviors in yellow-feathered broilers. Zheng et al. used machine vision to judge the posture of feeding laying hens in order to classify sick and dead hens [17]; however, this method is prone to misidentification when the posture of a healthy hen is abnormal or the camera capture position is shifted. Qu et al. proposed a LibSVM-based algorithm for visually detecting dead broilers by judging the morphological features of the chicken claws [18], but this method relied too heavily on the claws and was prone to misjudgment. Veera et al. identified dead broilers using infrared thermal images and broiler contours [19], but the algorithm was not suitable for classifying older (after nine weeks) chickens or for high densities of dead chickens. All of these studies have proven valuable for subsequent applications, but their dead chicken detection algorithms generally depend on the environment and on a single chicken feature.
In terms of dead chicken removal devices, research and applications have shown great promise. In the 1980s, robots began to be deployed in the field [20], perceiving environmental information and moving autonomously. Liu et al. designed a visual technology-based dead broiler removal device [21]: stainless-steel plates on both sides of the front end "swept" dead broilers onto a conveyor belt, which transported them to a storage warehouse at the back end. However, the slow movement of the device made it difficult to clean up dead chickens from other locations in the flat chicken house. Hu et al. designed a dead broiler picking actuator with three joints and four fingers, based on the underactuated principle, for dead broilers raised in captivity for 3 to 7 weeks [22]. However, the device was complex in construction and had not been applied in the poultry industry. Zhao et al. conducted in-depth research on the kinematic characteristics of a five-degrees-of-freedom dead broiler-picking robotic arm, with simulation analysis conducted in MATLAB 2020a [23], providing preliminary theoretical support for the design of subsequent control systems. However, this research remained at the theoretical level and did not include actual construction or experimental validation of the robotic arm.
Research on dead broilers has primarily concentrated on cage-raised broilers, with scant attention paid to recognizing dead broilers in free-range scenarios, and the ability of existing models to extract the features and details of dead chicken images remains limited. In terms of devices, despite relatively comprehensive theoretical exploration of structures for picking up dead broilers, the majority of studies have been confined to the theoretical level. In view of this, this study designed a simple, easy-to-use, and cost-effective dead broiler grasping and moving device with an average success rate of 81.3%. In addition, this study proposed an enhanced deep learning method for recognizing dead broilers based on YOLOv6n (hereinafter referred to as YOLOv6), addressing the problems of missing image details and missed detections in dense scenes.
2. Materials and Methods
2.1. Experimental Base
The experiment was performed in a large-scale free-range yellow-feather broiler farm at Jinniuhu Street, Luhe District, Nanjing City, Jiangsu Province, China (118°52′38″ E, 32°26′54″ N). Each broiler was about 15 weeks old, weighed about 1.45 kg, and had a chest width of about 12.95 cm. The interior scene of the experimental broiler house is shown in Figure 1. The data were collected from 8 March to 28 March 2023, and the testing experiment was conducted from 8 March to 12 March 2024.
2.2. Moving Chassis
The mobile chassis was an R550 (AKM) PLUS chassis (Wheel Technology Co., Ltd., Dongguan, China) with a suspended Ackermann steering structure, carrying a depth camera and a laser radar. It supported mapping and navigation, obstacle avoidance, sound source localization, wireless communication with the mechanical arm, and the image acquisition and processing subsystem. Figure 2 shows the structure of the dead broiler identification and grabbing device.
2.3. Image Acquisition and Processing System
The image acquisition and processing subsystem is composed of a binocular camera and a Jetson Nano (NVIDIA, Santa Clara, CA, USA). It identifies dead broilers and their coordinates in free-range broiler farms in real time via binocular vision positioning. The camera has a 5-megapixel sensor with a video resolution of 1920 × 1080. The Jetson Nano is a compact and feature-rich AI computing module developed by NVIDIA, equipped with a 128-core Maxwell GPU [24] and embedded with the trained recognition model for dead broilers. Figure 3 shows the image acquisition and processing system.
Binocular vision positioning utilizes the principle of stereo vision to construct a three-dimensional model of the scene, thereby determining the specific position of the target in space [25]. The imaging model is shown in Figure 4.
Equation (1) was derived from similar triangles.
$$Z = \frac{fb}{d} = \frac{fb}{\mu_L + \mu_R}, \qquad X = \frac{\mu_L Z}{f} \tag{1}$$
where Z is the normal distance from the chicken to the baseline (b) of the camera, X refers to the lateral distance from the chicken to the center of the camera, f refers to the camera focal length, and d is the binocular camera parallax, with a numerical value of μL + μR.
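To make the geometry concrete, the following minimal Python sketch recovers a target's depth and lateral offset from Equation (1). The focal length, baseline, and pixel offsets used here are hypothetical values for illustration, not the calibrated parameters of the device.

```python
import numpy as np

def stereo_position(mu_left, mu_right, focal_px, baseline_m):
    """Recover lateral offset X and depth Z of a target from binocular parallax.

    mu_left, mu_right: horizontal pixel offsets of the target from each camera's
    optical center, with signs chosen so that the disparity d = mu_left + mu_right.
    focal_px: focal length in pixels; baseline_m: camera baseline in meters.
    """
    d = mu_left + mu_right           # binocular parallax (Equation (1))
    if d <= 0:
        raise ValueError("non-positive disparity: target too far or mismatched")
    Z = focal_px * baseline_m / d    # normal distance from target to the baseline
    X = mu_left * Z / focal_px       # lateral distance from the camera center
    return X, Z

# Hypothetical camera parameters and measurements:
X, Z = stereo_position(mu_left=42.0, mu_right=38.0, focal_px=1050.0, baseline_m=0.06)
print(f"lateral X = {X:.3f} m, depth Z = {Z:.3f} m")
```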
2.4. Mechanical Arm
The dead broiler grasping manipulator was based on the Dobot manipulator (Dobot Technology Co., Ltd., Shenzhen, China), with the model and the parameters of the manipulator and motors adapted to the specifications of broilers. The structural material was 3D-printed resin, and the control system communicated with the Jetson Nano via Bluetooth for data transmission and instruction interaction. The maximum load was 5 kg, and the repeat positioning accuracy was 0.2 mm. Physical diagrams of the manipulator and the end effector are shown in Figure 5.
2.5. Camera Calibration and Hand–Eye Calibration
In order to realize the transformation from the real-world coordinate system to the manipulator's coordinate system, camera calibration and hand–eye calibration were needed [26]. According to the principle of camera imaging, the camera projects the actual 3D scene onto a 2D image; to infer three-dimensional chicken information from two-dimensional chicken image information, it was therefore necessary to calibrate the camera before locating the dead chicken's spatial position. In addition, the mobile chassis and robotic arm control system operated based on the position of the dead chicken, while the binocular camera was not fixed on the robotic arm, and its position could change. To ensure that the image information captured by the camera matched the device control system, the camera coordinate system and the robotic arm base coordinate system had to be calibrated into a unified frame to achieve accurate visual positioning. This process is commonly referred to as hand–eye calibration, and a schematic diagram of the relationship between the three coordinate systems is shown in Figure 6.
In camera calibration and hand–eye calibration operations, calibration plates such as checkerboards or solid circular arrays are usually used as auxiliary calibration objects. In this experiment, a checkerboard calibration plate was selected, as shown in Figure 7a. The calibration board comprised a 10 × 7 checkerboard, where the size of each square was 26 mm × 26 mm. The camera calibration result is shown in Figure 7b.
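As one concrete illustration, the standard OpenCV calibration workflow below estimates the camera intrinsics from checkerboard images like Figure 7a. This is a sketch of the usual procedure under stated assumptions, not the authors' exact code: the image folder name is hypothetical, and a board of 10 × 7 squares yields 9 × 6 inner corners for corner detection.

```python
import glob
import cv2
import numpy as np

# A 10 x 7 square checkerboard has 9 x 6 inner corners; each square is 26 mm.
pattern = (9, 6)
square_mm = 26.0

# Object points: inner-corner coordinates in the board's own (world) frame, Z = 0.
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square_mm

obj_points, img_points = [], []
for path in glob.glob("calib/*.jpg"):   # hypothetical folder of board images
    gray = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2GRAY)
    found, corners = cv2.findChessboardCorners(gray, pattern, None)
    if found:
        # Refine corner locations to sub-pixel accuracy.
        corners = cv2.cornerSubPix(
            gray, corners, (11, 11), (-1, -1),
            (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
        obj_points.append(objp)
        img_points.append(corners)

# Estimate intrinsics K, distortion coefficients, and per-view extrinsics (R, t).
ret, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
print("mean reprojection error:", ret)
```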
For the hand–eye calibration [27], the process was as follows: First, a world coordinate system was established on the calibration board, and the corners of the calibration board were touched by the end of the robotic arm, as shown in Figure 8. As the world coordinate system was manually set on the calibration board, the positions of these corners in the world coordinate system were known. The position information of these corners was used to solve the transformation matrix between the world coordinate system ({World}) and the robotic arm coordinate system ({Base}) [26,28]. The transformation matrix between the robotic arm coordinate system and the world coordinate system is shown in Equation (2).
$$T = \begin{bmatrix} R & P \\ 0 & 1 \end{bmatrix} \tag{2}$$
where R is the rotation matrix and P is the translation vector; P corresponds to the position of the origin of the world coordinate system in the coordinate system of the robotic arm. The rotation matrix can be solved by introducing a transition coordinate system, obtained by translating the coordinate system of the robotic dead chicken grasping arm so that its origin coincides with the origin of the world coordinate system, as shown in Figure 9 [29].
The transition coordinate system ({Transition}) is obtained from the robotic arm coordinate system through the translation vector, as shown in Equation (3). Since the origin of the transition coordinate system coincides with the origin of the world coordinate system, when the end of the robotic arm touches that origin, the position of the world coordinate system in the robotic arm coordinate system is determined, and the translation vector P can then be calculated.
$$\begin{bmatrix} x_t \\ y_t \\ z_t \end{bmatrix} = \begin{bmatrix} x_b \\ y_b \\ z_b \end{bmatrix} - P \tag{3}$$
where xb, yb, and zb are the three-dimensional coordinates in the {Base} coordinate system, and xt, yt, and zt are the three-dimensional coordinates in the {Transition} coordinate system. By applying the rotation R in the transition coordinate system, the coordinates in the robotic arm coordinate system can be obtained, as shown in Equation (4).
$$\begin{bmatrix} x_b \\ y_b \\ z_b \end{bmatrix} = R \begin{bmatrix} x_w \\ y_w \\ z_w \end{bmatrix} + P \tag{4}$$
where xw, yw, and zw are the coordinates of a point in the {World} coordinate system.
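The following numpy sketch shows how a rotation matrix R and translation vector P obtained from such a calibration can be assembled into the homogeneous transform of Equation (2) and used to map a dead broiler's world-frame position into the arm's base frame. The R and P values here are hypothetical, chosen only to illustrate the mapping.

```python
import numpy as np

def base_T_world(R, P):
    """Assemble the 4x4 homogeneous transform of Equation (2) from the
    rotation matrix R and translation vector P found during calibration."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = P
    return T

# Hypothetical calibration result: world frame rotated 90 degrees about Z,
# with P the world origin expressed in the robotic arm base frame (meters).
R = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])
P = np.array([0.25, -0.10, 0.02])
T = base_T_world(R, P)

# Map a dead-broiler position from the world frame into the base frame.
p_world = np.array([0.30, 0.15, 0.00, 1.0])  # homogeneous coordinates
p_base = T @ p_world
print(p_base[:3])
```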
2.6. Data Collection and Image Data Set Production
The quality and quantity of data play a vital role in the effective application of deep learning algorithms. High-quality data can significantly improve the performance of deep learning models [30]. Dead yellow-feather broilers exhibit tightly closed eyes, bodies pressed to the ground, or weak, stiff bodies. The other behavior states of yellow-feather broilers in scattered broiler farms include eating, lying down, and walking [31]. The different behavior states of broilers are shown in Table 1.
Among these, “Dead” broilers need to be grabbed. “Others” need not be grabbed. A schematic of the behavior is shown in Figure 10.
After screening, 1565 original images were obtained. The images were labeled using LabelMe 3.16.7 with the label categories shown in Table 2 and converted to PASCAL VOC format for subsequent use. More than one behavior of yellow-feathered chickens could appear in each image. After image preprocessing (resizing, random flipping, and translation transformation), 2456 images were obtained, which were divided into training, validation, and test sets at a ratio of 8:1:1.
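A minimal sketch of such a reproducible 8:1:1 split is shown below; the directory layout and file names are assumptions for illustration, not the study's actual paths.

```python
import random
from pathlib import Path

# Reproducible 8:1:1 split of the preprocessed images (hypothetical layout).
random.seed(42)
images = sorted(Path("dataset/images").glob("*.jpg"))
random.shuffle(images)

n = len(images)
n_train, n_val = int(0.8 * n), int(0.1 * n)
splits = {
    "train": images[:n_train],
    "val": images[n_train:n_train + n_val],
    "test": images[n_train + n_val:],
}
for name, files in splits.items():
    # Write one image path per line for each subset.
    Path(f"dataset/{name}.txt").write_text("\n".join(str(f) for f in files))
    print(name, len(files))
```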
2.7. Recognition Model of Dead Yellow-Feather Broiler
Existing deep learning-based target detection algorithms are divided into single-stage and two-stage approaches [32]. To select a suitable network model, three mainstream object detection algorithms, namely SSD, Faster-RCNN, and YOLOv6, were trained and evaluated on the same data set, and the model with the best comprehensive performance was selected. The deep learning methods used and their advantages and disadvantages are detailed in Table 3.
It can be seen from Table 3 that the Faster-RCNN model is more complex to train and is not easy to migrate to the image acquisition and processing system, whereas YOLOv6 is lightweight, easy to migrate, and has low requirements for the development environment [33]. Therefore, the network model for detecting dead broilers was improved based on the YOLOv6 model.
The Squeeze-and-Excitation (SE) attention mechanism is a lightweight module designed to enhance the representational power of convolutional neural networks, as shown in Figure 11. It improves the network’s characterization ability by explicitly capturing the correlation between convolution channels [34].
The improved YOLOv6 network model is shown in Figure 12. The SE module first compresses an H × W × C feature map to 1 × 1 × C through global average pooling, retaining only the channel dimension C. The compressed vector then passes through a squeeze-and-restore pair of fully connected layers with a ReLU activation between them, and a final sigmoid converts the result into per-channel weights, which are multiplied with the original input features to produce the recalibrated output features.
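The following PyTorch sketch implements this squeeze, excite, and reweight pipeline. The reduction ratio of 16 is the common default from the SE literature and is an assumption here, as the paper does not report the value it used.

```python
import torch
import torch.nn as nn

class SEBlock(nn.Module):
    """Squeeze-and-Excitation channel attention (cf. Figure 11), sketched in PyTorch."""

    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)   # global average pool: (B, C, H, W) -> (B, C, 1, 1)
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),  # squeeze the channel dimension
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),  # restore the channel dimension
            nn.Sigmoid(),                                # per-channel weights in (0, 1)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        w = self.fc(self.pool(x).view(b, c)).view(b, c, 1, 1)
        return x * w   # reweight the original input features channel by channel

# Example: recalibrate a feature map from a backbone stage.
feats = torch.randn(1, 256, 40, 40)
print(SEBlock(256)(feats).shape)   # torch.Size([1, 256, 40, 40])
```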
2.7.1. Model Training Parameters
The hardware platform used for training was configured with a Tesla V100 graphics card with 24 GB of memory and an AMD EPYC 9654 CPU. The Python version was 3.8, and the PyTorch framework version was 1.9.0.
2.7.2. Model Evaluation Metrics
When evaluating the performance of deep learning algorithms, the following key indicators are used: precision, which measures the proportion of true positive samples among the instances that the model predicts as positive; recall, which reflects the ability of the model to find all positive samples; mean average precision (mAP), which comprehensively evaluates model performance across the categories of a multi-category detection task; and the F1 score, the harmonic mean of precision and recall. Together, these indicators constitute a standard system for comprehensively evaluating the performance of deep learning algorithms [35]. Their formulas are shown in Equations (5)–(8):
$$\text{Precision} = \frac{TP}{TP + FP} \tag{5}$$
$$\text{Recall} = \frac{TP}{TP + FN} \tag{6}$$
$$mAP = \frac{1}{N} \sum_{i=1}^{N} A_i, \qquad A_i = \int_0^1 p_i(r)\,dr \tag{7}$$
$$F1 = \frac{2 \times \text{Precision} \times \text{Recall}}{\text{Precision} + \text{Recall}} \tag{8}$$
where TP is the number of true positive samples, FP is the number of false positive samples, FN is the number of false negative samples, N represents the number of detected target classes, Ai is the average precision of class i, and pi(r) is the precision of class i as a function of the recall r. The FPS (frames per second) is also used to measure the processing speed of the model in practical applications [36]; the larger the FPS value, the faster the model's detection speed.
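For reference, the metrics of Equations (5)–(8) reduce to a few lines of Python; the counts used below are illustrative numbers, not results from the experiments.

```python
def precision(tp, fp):
    return tp / (tp + fp)                          # Equation (5)

def recall(tp, fn):
    return tp / (tp + fn)                          # Equation (6)

def mean_average_precision(ap_per_class):
    # Equation (7): mean of per-class average precisions A_i.
    return sum(ap_per_class) / len(ap_per_class)

def f1_score(p, r):
    return 2 * p * r / (p + r)                     # Equation (8)

# Illustrative counts only:
p, r = precision(tp=86, fp=14), recall(tp=86, fn=11)
print(f"P={p:.2f}, R={r:.2f}, F1={f1_score(p, r):.2f}")
print(f"mAP={mean_average_precision([0.90, 0.84, 0.88]):.2f}")
```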
2.8. The Design of the Dead Broiler Grasping Manipulator
The Robotics Toolbox module in MATLAB provides powerful functions that can be used to simulate the kinematics and trajectory planning of the manipulator [37]. The manipulator model was established, and the posture of the end effector of the manipulator in the base coordinate system was calculated by the “forward_kinematics” function [38]. The D-H parameters of the manipulator were constructed, and the simulation model of the manipulator is shown in Figure 13.
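As an illustration of the forward kinematics the toolbox computes, the numpy sketch below chains standard D-H link transforms to obtain the end-effector pose in the base coordinate system. The D-H parameters shown are hypothetical placeholders, not the manipulator's actual values, which are given by the paper's D-H table.

```python
import numpy as np

def dh_matrix(theta, d, a, alpha):
    """Standard Denavit-Hartenberg transform for one link."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def forward_kinematics(joint_angles, dh_params):
    """Chain the link transforms to get the end-effector pose in {Base}."""
    T = np.eye(4)
    for theta, (d, a, alpha) in zip(joint_angles, dh_params):
        T = T @ dh_matrix(theta, d, a, alpha)
    return T

# Hypothetical (d, a, alpha) parameters for a 4-joint arm, in meters/radians.
dh_params = [(0.10, 0.00, np.pi / 2),
             (0.00, 0.20, 0.0),
             (0.00, 0.20, 0.0),
             (0.00, 0.05, 0.0)]
pose = forward_kinematics([0.0, np.pi / 4, -np.pi / 4, 0.0], dh_params)
print(pose[:3, 3])   # end-effector position in the base frame
```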
Path planning for the manipulator was realized via quintic polynomial interpolation, whose rationality was verified using the Robotics Toolbox; the resulting diagram is shown in Figure 14.
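A quintic polynomial trajectory is fully determined by six boundary conditions: position, velocity, and acceleration at the start and end of the motion. The sketch below solves the resulting linear system for the coefficients and evaluates one joint's motion; the joint angles and duration are illustrative, not the planned trajectory from the paper.

```python
import numpy as np

def quintic_coeffs(q0, qf, v0=0.0, vf=0.0, a0=0.0, af=0.0, tf=1.0):
    """Solve for c0..c5 in q(t) = c0 + c1*t + ... + c5*t^5 from the six
    boundary conditions at t = 0 and t = tf."""
    A = np.array([
        [1, 0,  0,       0,        0,         0],        # q(0)  = q0
        [0, 1,  0,       0,        0,         0],        # q'(0) = v0
        [0, 0,  2,       0,        0,         0],        # q''(0) = a0
        [1, tf, tf**2,   tf**3,    tf**4,     tf**5],    # q(tf)  = qf
        [0, 1,  2*tf,    3*tf**2,  4*tf**3,   5*tf**4],  # q'(tf) = vf
        [0, 0,  2,       6*tf,     12*tf**2,  20*tf**3], # q''(tf) = af
    ], dtype=float)
    b = np.array([q0, v0, a0, qf, vf, af])
    return np.linalg.solve(A, b)

# Joint moving from 0 to 90 degrees in 2 s, starting and ending at rest.
c = quintic_coeffs(q0=0.0, qf=np.pi / 2, tf=2.0)
t = np.linspace(0.0, 2.0, 5)
q = sum(ci * t**i for i, ci in enumerate(c))
print(np.degrees(q))   # smooth position profile with zero boundary vel/accel
```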
3. Results and Analysis
3.1. Performance of the Different Models
After the model was trained and the parameters were adjusted, the final evaluation was conducted using the test set. The overall results for the category of yellow-feathered broilers using the three algorithms SSD, Faster-RCNN, and YOLOv6 are shown in Table 4 and Figure 15. The standard errors of Faster-RCNN and YOLOv6 are shown in Table 5.
It can be seen from Table 4, Table 5, and Figure 15 that the detection accuracy of YOLOv6 is close to that of Faster-RCNN, while YOLOv6 has lower standard errors of precision and recall, meaning that its results are less dispersed. The YOLOv6 and SSD algorithms show clear advantages in running speed, which is related to their one-stage architectures [39,40]; among them, YOLOv6 has the fastest detection speed, as shown in Table 3.
Although YOLOv6 achieves good overall detection, it cannot identify distant broilers when the image background is complex, as shown in Figure 16. Figure 16 also shows that the recognition of broiler behavior states in the upper left corner is not ideal, leading to misidentification.
We therefore optimized the YOLOv6 model to improve its performance. The effectiveness of the SE module was verified through ablation experiments, as shown in Table 6 and Figures 17 and 18.
As can be seen from Table 6 and Figures 17 and 18, adding the CBAM or SE attention module to YOLOv6 yielded improvements of different degrees in mAP, F1 score, precision, and recall. The model incorporating the SE attention mechanism achieved the best results in recognizing the "Dead" label category. The CBAM module greatly increased the number of parameters; although its detection accuracy differed little from that of the SE module, its operation speed was significantly lower than that of the more lightweight SE module.
In order to enable the YOLOv6 + SE network model to handle larger pictures and videos of henhouse scenes and to improve its global perception, this study further improved YOLOv6 + SE with ASPP (Atrous Spatial Pyramid Pooling). ASPP leverages atrous convolutions with different dilation rates to extract multi-scale features without losing resolution or significantly increasing the computational cost; by applying parallel atrous convolutions with varying dilation rates, it effectively aggregates contextual information at multiple scales. The performance comparison between the original and improved models is shown in Table 7 and Figure 19, and the recognition result of the improved YOLOv6 + SE is shown in Figure 20.
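A minimal PyTorch sketch of such an ASPP block is given below. The dilation rates (1, 6, 12, 18) follow common practice in the ASPP literature and are assumptions here, as the paper does not list the rates it used.

```python
import torch
import torch.nn as nn

class ASPP(nn.Module):
    """Atrous Spatial Pyramid Pooling: parallel dilated convolutions that
    aggregate multi-scale context while preserving spatial resolution."""

    def __init__(self, in_ch: int, out_ch: int, rates=(1, 6, 12, 18)):
        super().__init__()
        self.branches = nn.ModuleList([
            nn.Sequential(
                # padding = dilation keeps the spatial size unchanged for 3x3 kernels
                nn.Conv2d(in_ch, out_ch, 3, padding=r, dilation=r, bias=False),
                nn.BatchNorm2d(out_ch),
                nn.ReLU(inplace=True),
            )
            for r in rates
        ])
        # Fuse the concatenated multi-scale branches back to out_ch channels.
        self.project = nn.Conv2d(out_ch * len(rates), out_ch, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.project(torch.cat([b(x) for b in self.branches], dim=1))

feats = torch.randn(1, 256, 40, 40)
print(ASPP(256, 256)(feats).shape)   # spatial size preserved: (1, 256, 40, 40)
```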
Comparing Figure 20 with Figure 16, it can be seen that the improved YOLOv6 identified significantly more broilers in different behavior states, indicating that the ASPP module reduces missed detections of broiler states in complex backgrounds. At the same time, the introduction of ASPP also improved the iteration and detection speed to a certain extent, compensating for the increased number of parameters brought by the attention mechanism. The training loss curves of the models are shown in Figure 21.
3.2. The Real-Time Detection Effect of the Model
The improved YOLOv6 network model was migrated to Jetson Nano, and real-time detection was performed by connecting a display screen. The result is shown in Figure 22.
In Figure 22, the red box displays the detection results, the green box displays the live video stream, and the yellow box shows the behavior states of the detected broilers and their detection times. This experiment confirms that the improved network model was successfully migrated to the Jetson Nano and that the real-time identification effect is good.
3.3. The Design of the Dead Broiler Grasping Experiment
In this experiment, a dead broiler was first placed at a point that the mobile device could easily reach and grasp. The device was then placed, before being turned on, at a position from which it could observe the dead broiler. Finally, the device was turned on and operated. The process of catching a dead broiler is shown in Figure 23.
During the experiment, differences were found in the parts of the dead broilers that were grabbed, as shown in Figure 24, and the success rate of the device depended on which body part was grabbed. Three grabbing modes were examined in three groups: group (a), grabbing by the back, recorded as experiment 1; group (b), grabbing by the back and chest, recorded as experiment 2; and group (c), grabbing by the chest, recorded as experiment 3. After each experiment, the position of the device was kept unchanged. The detailed data on the grabbing results are shown in Table 8.
The mobile device for identifying and grasping dead broilers completed the removal task with a success rate of over 77% in each experiment and an average success rate of 81.3%. The success rate was lowest when grabbing the feet and other parts with small contact areas. The success rate also decreased in areas where broilers were densely gathered, which may have been due to the device failing to collect information on the dead broilers; however, the mobile device can disperse the surrounding broilers during its operation, capture the required information, and complete the grab.
4. Conclusions
(1). The experimental results demonstrated that the mobile device developed for identifying and grasping dead broilers proved capable of fulfilling the task of removing dead broilers. It achieved a success rate of over 77%, with an average success rate of 81.3%. However, the success rate was lowest when grasping parts of the broilers such as the neck, feet, or other areas with small contact surfaces. The presence of the mobile device itself also exerted a certain influence on the success rate; this aspect will be a focus of future improvements. Additionally, the success rate of grasping dead broilers decreased in densely populated areas, which can be attributed to the device's limited ability to collect information on dead broilers. Nevertheless, the mobile device could disperse the broilers during movement, capture the necessary information, and complete the grasping task.
(2). This study proposed an enhanced deep learning-based approach for identifying broilers. The YOLOv6 algorithm, with its superior comprehensive performance, was selected as the base network and underwent in-depth optimization: a YOLOv6 network structure incorporating the SE attention mechanism and ASPP was proposed to address the issues found in broiler houses. The experimental outcomes indicated that the recognition accuracy of the improved model reached 86.1%.
(3). This study designed a mechanical arm for positioning and grasping dead broilers. A model joint simulation of the manipulator was conducted, and the motion trajectory was planned. The experimental results verified that the manipulator model passed the test, the transmission was stable, and the trajectory met the requirements, thereby providing the essential conditions for achieving stable grasping and attaining the design objective.
This study focused on the automatic identification and removal of dead broilers in large-scale flat-breeding yellow-feather broiler farms, aiming to develop a solution that combines vision technology and robotic arm control technology. In order to solve the challenge of dead broiler identification in complex environments, this study proposed a dead broiler detection algorithm with high accuracy, high speed, and easy portability. Compared with traditional machine learning methods, the algorithm achieved significant improvements in accuracy and real-time performance, ensuring that the speed requirements of the dead broiler cleaning process can be met.
In addition, this study independently developed a vision-based mobile device for dead broiler collection that successfully achieved the expected design goals and was capable of efficiently and rapidly identifying and disposing of dead broilers in large-scale free-range broiler farms. The device shows a modularized design, which not only facilitates future function expansion, maintenance, and system upgrades, but also improves the overall flexibility.
Despite these results, there are some limitations of this study, as follows:
(1). Limitations of applicability—The current study was tested and optimized mainly with respect to yellow-feathered broilers. Given the wide variety of broiler breeds available in the market, further verification of the applicability of the device for other breeds is required. If the recognition effect is found to be poor, a breed-specific image database needs to be established as a benchmark for recognition.
(2). Efficiency and energy-saving considerations—When there are multiple dead broilers in the coop at the same time, although the device is able to effectively detect and remove them, further research is needed into how to optimize path planning for a more efficient operation from the point of view of improving efficiency and energy use.
Future work will focus on improvements and refinements in both of these areas, in order to further increase the usefulness and adaptability of the system.
Author Contributions: Conceptualization, C.X. and X.Z.; methodology, C.X. and X.Z.; validation, C.X., H.L. and X.Z.; formal analysis, C.X., H.L. and Y.L.; investigation, C.X., H.L., Y.L. and M.W.; data curation, C.X., H.L., Y.L., M.W. and W.L.; writing—original draft preparation, C.X., H.L., Y.L., M.W. and W.L.; writing—review and editing, S.W., W.Z., M.X. and X.Z.; visualization, C.X., H.L. and Y.L.; supervision, S.W. and X.Z.; project administration, S.W., W.Z., M.X. and X.Z.; funding acquisition, S.W. and X.Z. All authors have read and agreed to the published version of the manuscript.
Institutional Review Board Statement: Not applicable.
Informed Consent Statement: Not applicable.
Data Availability Statement: Data are contained within the article.
Acknowledgments: We are thankful to Yungang Bai, Sunyuan Wang, and Zhilong Chen, who contributed to our field data collection and primary data analysis.
Conflicts of Interest: The authors declare no conflicts of interest.
Figure 5. The dead broiler grasping manipulator: (a) mechanical arm backbone; (b) end effector.
Figure 10. Different behavior images of yellow-feather broilers: (a) walking; (b) pecking; (c) resting; (d) inactive; (e) dead.
Figure 14. The simulation results: (a) acceleration simulation results; (b) velocity simulation results; (c) position simulation results. Note: the blue, red, purple, and orange lines represent the movements of joints 1, 2, 3, and 4, respectively.
Figure 15. Speed comparison of the YOLOv6, SSD, and Faster-RCNN detection models.
Figure 17. Comparison of the results of ablation experiments with different labeling categories.
Figure 18. Speed comparison of the YOLOv6, YOLOv6 + SE, and YOLOv6 + CBAM detection models.
Figure 19. Speed comparison of the YOLOv6, YOLOv6 + SE, and improved YOLOv6 + SE detection models.
Figure 21. Training loss curves of the algorithms: (a) SSD; (b) Faster-RCNN; (c) improved YOLOv6 + SE.
Figure 23. (a) Identifying the dead broiler. (b) The device is moving. (c) The device is grabbing the dead broiler. (d) The device is receiving the dead broiler. (e) The device is transporting the dead broiler.
Figure 24. The different parts grasped: (a) back grab; (b) back and chest grab; (c) chest grab.
Different behavioral definitions of yellow-feather broilers.
Behavior Classification | Classification Definition |
---|---|
Dead | The yellow-feather broiler’s eyes close, the body clings to the ground, or the body is weak and stiff. |
Others | This includes motions such as walking, pecking, inactivity, and resting. |
The definitions of categories for classification and the number of observations for each category.
Label Name and Number | Label Definition |
---|---|
Walking (20,864) | Actions such as standing, walking, and arranging feathers |
Pecking (15,648) | Yellow-feathered broilers with their heads touching the ground, troughs or water troughs, or tails cocked up |
Resting (10,432) | Yellow-feathered broilers lying on the ground |
Inactive (5216) | Yellow-feathered broilers lying on their backs with their bodies curled up or their tails drooping |
Dead (32,166) | Yellow-feathered broilers lying flat on the ground with their bodies in a rigid state |
Comparison of advantages and disadvantages of three algorithms.
Models | Exact Deep Learning Method | Pros | Cons |
---|---|---|---|
SSD | Single deep network for both object classification and localization using default boxes | Fast, simple, and effective for a wide range of object sizes | Less accurate on very small objects |
Faster-RCNN | Two-stage detector with a region proposal network (RPN) for generating candidate regions | High accuracy, flexible backbone networks, faster-than-previous R-CNN version | Slower than one-stage detectors, more complex to train |
YOLOv6 | Single-pass detector with efficient backbones and improved training techniques | Very fast, simple architecture, competitive accuracy | Limited information available, may struggle with small objects |
The overall object detection results of the category of yellow-feathered broilers under three models.
Models | Precision | Recall | F1 Score | mAP |
---|---|---|---|---|
YOLOv6 | 0.80 | 0.81 | 0.80 | 0.86 |
SSD | 0.78 | 0.78 | 0.78 | 0.80 |
Faster-RCNN | 0.81 | 0.81 | 0.81 | 0.87 |
The standard errors of Faster-RCNN and YOLOv6.
Models | Standard Errors of Precision | Standard Errors of Recall | Standard Errors of F1 Score |
---|---|---|---|
YOLOv6 | 0.62% | 0.51% | 0.68% |
Faster-RCNN | 0.81% | 0.77% | 0.73% |
Comparison of results of ablation experiments for overall categories.
Models | Precision | Recall | F1 Score | mAP |
---|---|---|---|---|
YOLOv6 + SE | 0.84 | 0.88 | 0.83 | 0.90 |
YOLOv6 + CBAM | 0.82 | 0.86 | 0.84 | 0.90 |
Comparison of improved and unimproved model results.
Models | Precision | Recall | F1 Score | mAP |
---|---|---|---|---|
YOLOv6 + SE | 0.84 | 0.88 | 0.83 | 0.90 |
Improved YOLOv6 + SE | 0.86 | 0.89 | 0.87 | 0.92 |
Detailed data on the grabbing results.
Experimental Serial Number | Grab Times | Success Rate |
---|---|---|
1 | 30 | 0.87 |
2 | 30 | 0.80 |
3 | 30 | 0.77 |
References
1. Zhou, S.; Watcharaanantapong, P.; Yang, X.; Thornton, T.; Gan, H.; Tabler, T.; Zhao, Y. Evaluating broiler welfare and behavior as affected by growth rate and stocking density. Poult. Sci.; 2024; 103, 103459. [DOI: https://dx.doi.org/10.1016/j.psj.2024.103459]
2. Li, G.; Zhao, Y.; Porter, Z.; Purswell, J. Automated measurement of broilers under four stocking densities via faster region-based convolutional neural network. Animal; 2021; 15, 100059. [DOI: https://dx.doi.org/10.1016/j.animal.2020.100059]
3. Chen, G.; Ling, X.; Xie, M.; Xiong, Y.; Li, T.; Shui, C.; Li, C.; Xu, B.; Ma, F. Systematic evaluation of the meat qualities of free-range chicken (Xuan-Zhou) under different ages explored the optimal slaughter age. Poult. Sci.; 2024; 103, 104019. [DOI: https://dx.doi.org/10.1016/j.psj.2024.104019]
4. Park, J.; Kwon, O.; Lee, K.; Heo, Y.; Yoon, C. Ammonia and Hydrogen Sulfide Monitoring in Broiler Barns and Cattle Barns. J. Environ. Health Sci.; 2015; 41, pp. 277-288. [DOI: https://dx.doi.org/10.5668/JEHS.2015.41.5.277]
5. Charman, W. Visual standards for driving. Ophthal. Physiol. Opt.; 2015; 5, pp. 211-220. [DOI: https://dx.doi.org/10.1111/j.1475-1313.1985.tb00658.x]
6. Li, X.; Yan, F.; Hu, K.; He, X.; Ma, Z.; Yuan, Q.; Mirzoev, S.I. Design and test of the integrated machine for self-propelled manure collection and bagging in flat chicken coops. Trans. Chin. Soc. Agric. Eng.; 2024; 40, pp. 251-261.
7. Shi, Z.; Xi, L.; Ji, Z.; Cheng, P. LED illuminant improving broilers house environment and growth performance. Trans. Chin. Soc. Agric. Eng.; 2017; 33, pp. 222-227.
8. Hinojosa, C.; Caldwell, D.; Byrd, J.; Ross, M.; Stringfellow, K.; Fowlkes, E.; Lee, J.; Stayer, P.; Farnell, Y.; Farnell, M. Use of a foaming disinfectant and cleaner to reduce aerobic bacteria on poultry transport coops. J. Appl. Poult. Res.; 2015; 24, pp. 364-370. [DOI: https://dx.doi.org/10.3382/japr/pfv036]
9. Okada, H.; Itoh, T.; Suzuki, K.; Tsukamoto, K. Wireless sensor system for detection of avian influenza outbreak farms at an early stage. Proceedings of the 8th IEEE Conference on Sensors; Christchurch, New Zealand, 25–28 October 2009; pp. 1374-1377.
10. Aydin, A.; Bahr, C.; Viazzi, S.; Exadaktylos, V.; Buyse, J.; Berckmans, D. A novel method to automatically measure the feed intake of broiler chickens by sound technology. Comput. Electron. Agric.; 2008; 101, pp. 17-23. [DOI: https://dx.doi.org/10.1016/j.compag.2013.11.012]
11. Cao, Y.; Teng, G.; Yu, L.; Li, Q. Comparison of different de-noising methods in vocalization environment of laying hens including fan noise. Trans. Chin. Soc. Agric. Eng.; 2014; 30, pp. 212-218.
12. Jacob, F.; Baracho, M.; Nääs, I.; Souza, R.; Salgado, D. The use of infrared thermography in the identification of pododermatitis in broilers. Eng. Agric.; 2016; 36, pp. 253-259. [DOI: https://dx.doi.org/10.1590/1809-4430-Eng.Agric.v36n2p253-259/2016]
13. Wang, J.; Wang, N.; Li, L.; Ren, Z. Real-time behavior detection and judgement of egg breeders based on YOLO v3. Neural Comput. Appl.; 2020; 32, pp. 5471-5481. [DOI: https://dx.doi.org/10.1007/s00521-019-04645-4]
14. Zhang, Q. Application and Innovation of New Media Technology in Visual Communication Design. Agro Food Ind. Hi Tech.; 2017; 28, pp. 3170-3173.
15. Shafay, M.; Ahmad, R.; Salah, K.; Yaqoob, I.; Jayaraman, R.; Omar, M. Blockchain for deep learning: Review and open challenges. Clust. Comput.; 2023; 26, pp. 197-221. [DOI: https://dx.doi.org/10.1007/s10586-022-03582-7]
16. Lu, C. Study on Dead Birds Detection System Based on Machine Vision in Modern Chicken Farm. Master’s Thesis; JiangSu University: Zhenjiang, China, 2009.
17. Zheng, S.; Wang, L. Development of Monitoring System for Layers Rearing in Multi-Tier Vertical Cages Using Machine Vision. J. Jilin Agric. Univ.; 2009; 31, pp. 476-480.
18. Qu, Z. Study on Detection Method of Dead Chicken in Unmanned Chicken Farm. Master’s Thesis; Jilin University: Changchun, China, 2009.
19. Veera, V.; Yang, Z. Automatic Identification of Broiler Mortality Using Image Processing Technology. ASABE Meet. Present.; 2018; 18, pp. 3-5.
20. Sequeira, J.; Ribeiro, M. Human-robot interaction and robot control. Robot. Motion Control Recent.; 2006; 35, pp. 375-390.
21. Liu, H.; Chen, C.; Tsai, Y.; Hsieh, K.; Lin, H. Identifying images of dead chickens with a chicken removal system integrated with a deep learning algorithm. Sensors; 2021; 21, 3579. [DOI: https://dx.doi.org/10.3390/s21113579]
22. Hu, Z. Research on Underactuated End Effector of Dead Chicken Picking Robot. Master’s Thesis; HeBei Agricultural University: Baoding, China, 2021.
23. Zhao, W.; Chen, Y.; Cao, T. Design and kinematics analysis of robotic arm used for picking up dead chickens. J. Chin. Agric. Mech.; 2023; 44, pp. 131-136.
24. Li, Y.; Liu, Q.; Li, T.; Wu, Y.; Niu, Z.; Hou, J. Design and experiments of garlic bulbil orientation adjustment device using Jetson Nano processor. Trans. Chin. Soc. Agric. Eng.; 2021; 37, pp. 35-42.
25. Wang, S.; Hu, Y. Binocular visual positioning under inhomogeneous, transforming and fluctuating media. Trait. Du Signal; 2018; 35, pp. 253-276. [DOI: https://dx.doi.org/10.3166/ts.35.253-276]
26. Li, X.; Zhang, E.; Fang, X.; Zhai, B. Calibration Method for Industrial Robots Based on the Principle of Perigon Error Close. IEEE Access; 2022; 10, pp. 48569-48576. [DOI: https://dx.doi.org/10.1109/ACCESS.2022.3172505]
27. Yue, W.; Hua, S.; Ying, D. A Complete Analytical Solution to Hand-Eye Calibration Using Quaternions and Eigenvector-Eigenvalue Identity. J. Intell. Robot. Syst.; 2023; 109, 54.
28. Shang, Y.; Shen, J.; Wei, W.; Zheng, B. Optimization of Ball Mill Cylinder Structure Based on Response Surface Optimization Module and Multi-objective Genetic Algorithm. J. Mech. Sci. Technol.; 2024; 38, pp. 3631-3640. [DOI: https://dx.doi.org/10.1007/s12206-024-0636-5]
29. Feng, X.; Tian, D.; Wu, H. A matrix-solving hand-eye calibration method considering robot kinematic errors. J. Manuf. Process.; 2023; 99, pp. 618-635. [DOI: https://dx.doi.org/10.1016/j.jmapro.2023.05.073]
30. Cui, G.; Qiao, L.; Li, Y.; Chen, Z.; Liang, Z.; Xin, C.; Xiao, M.; Zou, X. Division of Cow Production Groups Based on SOLOv2 and Improved CNN-LSTM. Agriculture; 2023; 13, 1562. [DOI: https://dx.doi.org/10.3390/agriculture13081562]
31. Lao, F.; Teng, G.; Li, J.; Yu, L.; Li, Z. Behavior recognition method for individual laying hen based on computer vision. Trans Chin. Soc. Agric. Eng.; 2018; 28, pp. 157-163.
32. Guo, J.; He, G.; Deng, H.; Fan, W.; Xu, L.; Cao, L.; Feng, D.; Li, J.; Wu, H.; Lv, J. et al. Pigeon cleaning behavior detection algorithm based on lightweight network. Comput. Electron. Agric.; 2022; 199, 107032. [DOI: https://dx.doi.org/10.1016/j.compag.2022.107032]
33. Bist, B.; Subedi, S.; Yang, X.; Chai, L. A Novel YOLOv6 Object Detector for Monitoring Piling Behavior of Cage-Free Laying Hens. AgriEngineering; 2023; 5, pp. 905-923. [DOI: https://dx.doi.org/10.3390/agriengineering5020056]
34. Xie, W.; Kimura, M.; Takaki, K.; Asada, Y.; Iida, T.; Jia, X. Interpretable Framework of Physics-Guided Neural Network with Attention Mechanism: Simulating Paddy Field Water Temperature Variations. Water Resour. Res.; 2022; 58, e2021WR030493. [DOI: https://dx.doi.org/10.1029/2021WR030493]
35. Yun, Y.; Choi, J.; Chung, H.; Bae, K.; Moon, J. Performance evaluation of an occupant metabolic rate estimation algorithm using activity classification and object detection models. Build. Environ.; 2024; 252, 252111299. [DOI: https://dx.doi.org/10.1016/j.buildenv.2024.111299]
36. Sun, F.; Wang, Y.; Lan, P.; Zhang, X.; Chen, X.; Wang, Z. Identification of apple fruit diseases using improved YOLOv5s and transfer learning. Trans Chin. Soc. Agric. Eng.; 2022; 38, pp. 171-179.
37. Corke, P. MATLAB toolboxes: Robotics and vision for students and teachers. IEEE Robot. Autom. Mag.; 2007; 14, pp. 16-17. [DOI: https://dx.doi.org/10.1109/M-RA.2007.912004]
38. Vila-Rosado, D.; Domínguez-López, J. A MATLAB toolbox for robotic manipulators. Proceedings of the Sixth Mexican International Conference on Computer Science; Puebla, Mexico, 26–30 September 2005; pp. 256-263.
39. Liu, Y.; He, Y.; Wu, X.; Wang, W.; Zhang, L.; Lv, H. Potato Sprouting and Surface Damage Detection Method Based on Improved Faster R-CNN. Trans Chin. Soc. Agric. Mach.; 2024; 55, pp. 371-378.
40. Biswas, D.; Su, H.; Wang, C.; Stevanovic, A.; Wang, W. An automatic traffic density estimation using Single Shot Detection (SSD) and Mobile Net-SSD. Phys. Chem. Earth; 2019; 110, pp. 176-184. [DOI: https://dx.doi.org/10.1016/j.pce.2018.12.001]
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Abstract
The existence of dead broilers in flat broiler houses poses significant challenges to large-scale and welfare-oriented broiler breeding. To ensure the timely identification and removal of dead broilers, a mobile device based on visual technology for grasping them was meticulously designed in this study. Among the multiple recognition models explored, the YOLOv6 model was selected due to its exceptional performance, attaining an impressive 86.1% accuracy in identification. This model, when integrated with a specially designed robotic arm, forms a potent combination for effectively handling the task of grasping dead broilers. Extensive experiments were conducted to validate the efficacy of the device. The results reveal that the device achieved an average grasping rate of dead broilers of 81.3%. These findings indicate that the proposed device holds great potential for practical field deployment, offering a reliable solution for the prompt identification and grasping of dead broilers, thereby enhancing the overall management and welfare of broiler populations.
1 College of Engineering, Nanjing Agricultural University, Nanjing 210031, China;
2 College of Artificial Intelligence, Nanjing Agricultural University, Nanjing 210031, China;
3 School of Electrical and Control Engineering, Xuzhou University of Technology, Xuzhou 221018, China;
4 Faculty of Engineering and Information Technology, University of Technology Sydney, Sydney, NSW 2007, Australia;