Abstract

Simple Summary

Welfare-oriented pig farming is a farming method that emphasizes animal welfare, which is of great significance to the health and production performance of pigs. Estimating pig weight with computer vision avoids direct contact with the animals, reduces losses caused by contact or stress, and ultimately improves overall breeding efficiency and economic returns. In this paper, we propose a new pig weight estimation method based on Mask R-CNN and machine learning. The method extracts features from a new perspective to address the problems of uneven illumination and body bending. The experimental results show that our method can accurately predict the weight of pigs.

Abstract

Using computer vision technology to estimate pig live weight is an important way to promote pig welfare, but two key issues affect the estimation: uneven illumination, which makes the extracted pig contours unclear, and bending of the pig body, which distorts the body measurements. For the first issue, Mask R-CNN was used to segment the pig, and the resulting mask was converted into a binary image from which a more accurate contour could be obtained. For the second, the body length, hip width, and the distance from the camera to the pig's back were corrected using XGBoost and actual measurements. We then analyzed the rationality of the extracted features, and three feature combination strategies were used to predict pig weight. In total, 1505 back images of 39 pigs captured with an Azure Kinect DK were used in the numerical experiments. XGBoost achieved the highest prediction accuracy, with an MAE of 0.389, an RMSE of 0.576, a MAPE of 0.318%, and an R² of 0.995. We also recommend the Mask R-CNN + RFR method because it achieves fairly high precision under every strategy. The experimental results show that the proposed method performs excellently in estimating the live weight of pigs.
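The abstract reports four standard regression metrics (MAE, RMSE, MAPE, and R²). As a minimal sketch of how these metrics are defined, the following pure-Python example computes them on illustrative toy data; the weight values are invented for demonstration and are not the paper's actual predictions.

```python
# Standard regression metrics used in the abstract: MAE, RMSE, MAPE, R^2.
import math

def mae(y_true, y_pred):
    # Mean absolute error: average magnitude of the prediction errors.
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

def rmse(y_true, y_pred):
    # Root mean squared error: penalizes large errors more heavily than MAE.
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

def mape(y_true, y_pred):
    # Mean absolute percentage error, in percent (as reported in the abstract).
    return 100.0 * sum(abs((t - p) / t) for t, p in zip(y_true, y_pred)) / len(y_true)

def r2(y_true, y_pred):
    # Coefficient of determination: 1 minus residual sum of squares
    # over total sum of squares around the mean.
    mean_t = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean_t) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot

# Toy example: hypothetical measured vs. predicted pig weights in kg.
y_true = [100.0, 110.0, 120.0, 130.0]
y_pred = [100.5, 109.5, 120.5, 129.5]
print(mae(y_true, y_pred))   # 0.5
print(rmse(y_true, y_pred))  # 0.5
```

A low MAPE with a high R², as reported for the XGBoost model, indicates both small relative errors and that the model explains nearly all of the variance in the measured weights.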

Details

Title
Pig Weight Estimation Method Based on a Framework Combining Mask R-CNN and Ensemble Regression Model
Author
Jiang, Sheng 1; Zhang, Guoxu 2; Shen, Zhencai 3; Zhong, Ping 1; Tan, Junyan 1; Liu, Jianfeng 4

1  College of Science, China Agricultural University, Beijing 100083, China; [email protected] (S.J.); [email protected] (Z.S.); [email protected] (P.Z.); National Innovation Center for Digital Fishery, China Agricultural University, Beijing 100083, China; [email protected]
2  National Innovation Center for Digital Fishery, China Agricultural University, Beijing 100083, China; [email protected]; College of Information and Electrical Engineering, China Agricultural University, Beijing 100083, China
3  College of Science, China Agricultural University, Beijing 100083, China; [email protected] (S.J.); [email protected] (Z.S.); [email protected] (P.Z.); National Innovation Center for Digital Fishery, China Agricultural University, Beijing 100083, China; [email protected]; Key Laboratory of Agricultural Information Acquisition, Ministry of Agriculture, Beijing 100083, China; Beijing Engineering and Technology Research Center for Internet of Things in Agriculture, Beijing 100083, China
4  College of Animal Science and Technology, China Agricultural University, Beijing 100083, China
First page
2122
Publication year
2024
Publication date
2024
Publisher
MDPI AG
e-ISSN
2076-2615
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
3084703497
Copyright
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.