
Abstract

Ocean exploration is crucial for utilizing the ocean's extensive resources, yet images captured by underwater robots suffer from color distortion and reduced contrast. To address these issues, an enhancement algorithm is proposed that integrates a Transformer and a Convolutional Neural Network (CNN) in a parallel fusion manner. First, a novel Transformer model employing peak signal-to-noise ratio (PSNR) attention and linear operations is introduced to capture local features. Next, to extract global features, both temporal- and frequency-domain features are incorporated into the construction of the CNN. Finally, the image's high- and low-frequency information is used to fuse the two sets of features. To demonstrate the algorithm's effectiveness, underwater images with various levels of color distortion are selected for both qualitative and quantitative analyses. The experimental results show that the approach outperforms other mainstream methods, achieving superior PSNR and structural similarity index measure (SSIM) scores and improving detection performance by more than ten percent.
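The abstract reports image quality with PSNR and SSIM. As an illustrative sketch only (not the authors' code), the NumPy snippet below computes PSNR and a simplified global, single-window SSIM between a reference image and an enhanced candidate; practical evaluations typically average the SSIM statistic over local sliding windows (e.g. `skimage.metrics.structural_similarity`).

```python
import numpy as np

def psnr(reference: np.ndarray, test: np.ndarray, max_val: float = 255.0) -> float:
    """Peak signal-to-noise ratio in dB; higher means closer to the reference."""
    mse = np.mean((reference.astype(np.float64) - test.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * np.log10(max_val ** 2 / mse)

def global_ssim(reference: np.ndarray, test: np.ndarray, max_val: float = 255.0) -> float:
    """SSIM computed once over the whole image (simplified, no sliding window)."""
    x = reference.astype(np.float64)
    y = test.astype(np.float64)
    c1 = (0.01 * max_val) ** 2  # stabilizing constants from the standard SSIM definition
    c2 = (0.03 * max_val) ** 2
    mu_x, mu_y = x.mean(), y.mean()
    var_x, var_y = x.var(), y.var()
    cov_xy = ((x - mu_x) * (y - mu_y)).mean()
    return ((2 * mu_x * mu_y + c1) * (2 * cov_xy + c2)) / (
        (mu_x ** 2 + mu_y ** 2 + c1) * (var_x + var_y + c2)
    )

# Toy check with synthetic data: a noisy copy scores worse than a perfect copy.
rng = np.random.default_rng(0)
clean = rng.integers(0, 256, size=(64, 64)).astype(np.float64)
noisy = np.clip(clean + rng.normal(0.0, 10.0, size=clean.shape), 0, 255)
```

For identical inputs `global_ssim` returns 1.0 and `psnr` is unbounded; additive noise lowers both scores, which is the behavior the paper's metrics rely on when comparing enhancement methods.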

Details

Title
Enhancement of Underwater Images through Parallel Fusion of Transformer and CNN
Author
Liu, Xiangyong 1; Chen, Zhixin 2; Xu, Zhiqiang 2; Zheng, Ziwei 3; Ma, Fengshuang 2; Wang, Yunjie 2

1 Fishery Machinery and Instrument Research Institute, Chinese Academy of Fishery Science, Shanghai 200092, China; [email protected] (X.L.); [email protected] (Z.C.); [email protected] (F.M.); [email protected] (Y.W.); State Key Laboratory of the Internet of Things for Smart City (IOTSC), University of Macau, Macau 999078, China
2 Fishery Machinery and Instrument Research Institute, Chinese Academy of Fishery Science, Shanghai 200092, China; [email protected] (X.L.); [email protected] (Z.C.); [email protected] (F.M.); [email protected] (Y.W.)
3 Digital Industry Research Institute, Zhejiang Wanli University, No. 8 South Qian Hu Road, Ningbo 315199, China; [email protected]
First page
1467
Publication year
2024
Publication date
2024
Publisher
MDPI AG
e-ISSN
2077-1312
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
3110601297
Copyright
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.