Abstract
Live fish recognition and classification play a pivotal role in underwater understanding, because they help scientists monitor the subsea inventory in order to aid fishery management. However, despite technological progress, fish recognition systems still have many limitations when observing fish. Difficulties in visualizing optical images can arise due to the external attenuation and scattering properties of water. Optical underwater imaging systems can also face detection problems such as changes in the appearance or orientation of objects and changes in the scene. In this paper, we propose a new object classification system for underwater optical images. The proposed method is based on robust feature extraction from fish patterns. A specific pre-processing method is used in order to improve the recognition accuracy. A mean-shift algorithm segments the images and isolates objects from the background in the raw images. The training data is processed by Principal Component Analysis (PCA), from which we calculate the prior inter-feature probabilities. The decision is made using combined Bayesian Artificial Neural Networks (ANNs). The ANNs capture the non-linear relationships among the extracted features and compute the posterior probabilities. These probabilities are verified in a final step in order to keep (or reject) the decision. A comparison with state-of-the-art methods shows that the proposed system outperforms most solutions under different environmental conditions. The solution handles both artificial and real environments. The simulation results indicate that the proposed approach provides good precision in distinguishing between different fish species. An average accuracy of 94.6% is achieved using the proposed recognition method.
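The pipeline described above (mean-shift segmentation, PCA feature reduction, an ANN producing posterior probabilities, and a final accept/reject check on those probabilities) can be sketched in broad strokes as follows. This is a minimal illustration, not the authors' implementation: the data is synthetic, the bandwidth, component count, network size, and confidence threshold are all assumed values chosen for the example.

```python
# Hedged sketch of a mean-shift -> PCA -> ANN -> posterior-verification
# pipeline, loosely mirroring the stages named in the abstract.
# All data is synthetic; every parameter value here is an assumption.
import numpy as np
from sklearn.cluster import MeanShift
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Stage 1 (stand-in): mean-shift clustering of pixel-like color values,
# as a rough analogue of segmenting objects from the background.
pixels = np.vstack([rng.normal(0, 0.5, (100, 3)),    # "background" colors
                    rng.normal(4, 0.5, (100, 3))])   # "object" colors
segmentation = MeanShift(bandwidth=2.0).fit(pixels)

# Stage 2 (stand-in): feature vectors for two fish "species",
# reduced with PCA before classification.
X = np.vstack([rng.normal(0, 1, (50, 64)),
               rng.normal(3, 1, (50, 64))])
y = np.array([0] * 50 + [1] * 50)
pca = PCA(n_components=10).fit(X)
Z = pca.transform(X)

# Stage 3: an ANN whose softmax outputs serve as posterior probabilities.
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000,
                    random_state=0).fit(Z, y)
posteriors = clf.predict_proba(Z)

# Stage 4: verify the posteriors; keep the decision only if the top
# class probability clears an (assumed) confidence threshold,
# otherwise mark the sample as rejected (-1).
threshold = 0.9
decisions = np.where(posteriors.max(axis=1) >= threshold,
                     posteriors.argmax(axis=1), -1)
```

The rejection step in the last stage is one plausible reading of the abstract's "keep (or reject) the decision": low-confidence posteriors are withheld rather than forced into a class.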