

Abstract

Individual fish segmentation is a prerequisite for feature extraction and object identification in any machine vision system. In this paper, a method for segmenting overlapping fish images in aquaculture was proposed. First, the shape factor was used to determine whether an overlap exists in the image. Then, corner points were extracted using the curvature scale space algorithm, and the skeleton was obtained with an improved Zhang-Suen thinning algorithm. Finally, intersection points were identified and the overlapped region was segmented. The results show that the average error rate and average segmentation efficiency of this method were 10% and 90%, respectively. Compared with the traditional watershed method, the separation points are more accurate and the segmentation accuracy is higher. Thus, the proposed method achieves better segmentation accuracy and effectiveness. It can be applied to multi-target segmentation and fish behavior analysis systems, where it can effectively improve recognition precision.
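Two of the abstract's building blocks can be sketched in plain NumPy. The sketch below is illustrative only: `shape_factor` assumes the paper's "shape factor" denotes the standard circularity measure 4πA/P² (≈1 for a single round blob, lower when fish overlap), and `zhang_suen_thinning` implements the classic Zhang-Suen algorithm, not the authors' improved variant; the curvature scale space corner detector is not reproduced here.

```python
import numpy as np

def shape_factor(area: float, perimeter: float) -> float:
    """Circularity 4*pi*A/P^2 of a connected region.

    Equals 1.0 for a perfect disc; an overlapping pair of fish
    bodies yields a markedly lower value, which can be thresholded
    to flag candidate overlap regions (assumed interpretation).
    """
    return 4.0 * np.pi * area / (perimeter ** 2)

def zhang_suen_thinning(img: np.ndarray) -> np.ndarray:
    """Classic Zhang-Suen thinning of a binary (0/1) image.

    Iteratively peels boundary pixels in two sub-iterations until
    no pixel changes, leaving a 1-pixel-wide skeleton.
    """
    skel = img.astype(np.uint8).copy()
    changed = True
    while changed:
        changed = False
        for step in (0, 1):
            to_delete = []
            for y in range(1, skel.shape[0] - 1):
                for x in range(1, skel.shape[1] - 1):
                    if skel[y, x] != 1:
                        continue
                    # 8-neighbours clockwise from north: P2..P9
                    p = [skel[y-1, x], skel[y-1, x+1], skel[y, x+1],
                         skel[y+1, x+1], skel[y+1, x], skel[y+1, x-1],
                         skel[y, x-1], skel[y-1, x-1]]
                    b = sum(p)  # count of non-zero neighbours
                    # number of 0 -> 1 transitions around the ring
                    a = sum(p[i] == 0 and p[(i + 1) % 8] == 1
                            for i in range(8))
                    if not (2 <= b <= 6 and a == 1):
                        continue
                    p2, p4, p6, p8 = p[0], p[2], p[4], p[6]
                    if step == 0:
                        ok = p2 * p4 * p6 == 0 and p4 * p6 * p8 == 0
                    else:
                        ok = p2 * p4 * p8 == 0 and p2 * p6 * p8 == 0
                    if ok:
                        to_delete.append((y, x))
            # delete simultaneously after each sub-iteration scan
            for y, x in to_delete:
                skel[y, x] = 0
            changed = changed or bool(to_delete)
    return skel
```

For example, thinning a solid 3x7 bar (`bar = np.zeros((7, 11), np.uint8); bar[2:5, 2:9] = 1`) reduces it to a 1-pixel-wide line whose branch/intersection points could then be searched for, as the pipeline describes.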

Details

Title
Method for segmentation of overlapping fish images in aquaculture
Author
Zhou, Chao 1; Lin, Kai 2; Xu, Daming 1; Liu, Jintao 1; Zhang, Song 1; Sun, Chuanheng; Yang, Xinting

1 Beijing Research Center for Information Technology in Agriculture, Beijing 100097, China
2 Beijing Fisheries Research Institute, Beijing 100068, China
Pages
135-142
Publication year
2019
Publication date
Nov 2019
Publisher
International Journal of Agricultural and Biological Engineering (IJABE)
ISSN
19346344
e-ISSN
19346352
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
2335159140
Copyright
© 2019. This work is published under https://creativecommons.org/licenses/by/4.0 (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.