© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.

Abstract

The exploration of the ocean has led to the deployment of numerous autonomous underwater vehicles and sensors, creating a growing need for multi-user underwater acoustic communication. Because the underwater acoustic channel offers only limited bandwidth, downlink non-orthogonal multiple access (NOMA) is a key technology for overcoming this limitation and is expected to benefit many modern wireless underwater acoustic applications. In downlink NOMA underwater acoustic (UWA) communication, a source station broadcasts data symbols to several users using superposition coding with different power levels, enabling detection through successive interference cancellation (SIC) receivers. However, SIC receivers require comprehensive knowledge of the channel conditions and channel state information (CSI), both of which are difficult to obtain, particularly in an underwater environment. To address this critical issue, this research proposes a downlink UWA communication receiver based on a deep neural network (DNN) built from 1D convolutional neural network (CNN) layers. Two cases are considered for the proposed system: in the first, two users at different power levels and distances from the transmitter employ BPSK and QPSK modulations to support multi-user communication; in the second, three users employ BPSK modulation. Users far from the base station receive the most power, and the base station applies superposition coding. The BELLHOP ray-tracing algorithm is used to generate the training dataset, with variations in user depth and range. For training, the composite signal is passed through sampled UWA channel impulse responses (CIRs) and fed to the model along with labels; the DNN receiver thereby learns the characteristics of the UWA channel and does not depend on CSI. A separate set of testing CIRs is used to evaluate the trained model.
The results are compared with a traditional SIC receiver: in simulation, the DNN-based downlink NOMA UWA receiver outperforms the SIC receiver in terms of BER for all modulation orders.
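The superposition-coding and SIC steps described above can be sketched in a minimal two-user BPSK example. This is an illustrative simulation only, not the paper's DNN receiver or BELLHOP channel: the power-allocation values, noise level, and an ideal AWGN channel are all assumptions made here for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# Power allocation: the far user gets the larger share, as in the paper.
p_far, p_near = 0.8, 0.2

# BPSK symbols for each user (the paper also considers QPSK).
s_far = 2 * rng.integers(0, 2, n) - 1
s_near = 2 * rng.integers(0, 2, n) - 1

# Superposition coding at the base station: power-weighted sum.
x = np.sqrt(p_far) * s_far + np.sqrt(p_near) * s_near

# Stand-in AWGN channel (the paper instead uses BELLHOP-generated UWA CIRs).
y = x + 0.05 * rng.standard_normal(n)

# Near user's SIC receiver:
# 1) detect the stronger (far user's) signal, treating its own as noise;
s_far_hat = np.sign(y)
# 2) cancel the detected signal, then detect its own from the residual.
residual = y - np.sqrt(p_far) * s_far_hat
s_near_hat = np.sign(residual)

ber_near = np.mean(s_near_hat != s_near)
```

With the mild noise assumed here, both detection stages succeed; under a realistic multipath UWA channel, SIC performance degrades without accurate CSI, which is the motivation for the DNN receiver.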

Details

Title
Deep-Neural-Network-Based Receiver Design for Downlink Non-Orthogonal Multiple-Access Underwater Acoustic Communication
Author
Habib Hussain Zuberi 1; Liu, Songzuo 1; Bilal, Muhammad 1; Alharbi, Ayman 2; Jaffar, Amar 2; Syed Agha Hussnain Mohsan 3; Miyajan, Abdulaziz 2; Mohsin Abrar Khan 1

1 Acoustic Science and Technology Laboratory, Harbin Engineering University, Harbin 150001, China; [email protected] (H.H.Z.); [email protected] (M.B.); [email protected] (M.A.K.); Key Laboratory of Marine Information Acquisition and Security, Harbin Engineering University, Ministry of Industry and Information Technology, Harbin 150001, China; College of Underwater Acoustic Engineering, Harbin Engineering University, Harbin 150001, China
2 Computer and Network Engineering Department, College of Computing, Umm Al-Qura University, Mecca 24231, Saudi Arabia; [email protected] (A.A.); [email protected] (A.J.); [email protected] (A.M.)
3 Ocean College, Zhejiang University, Hangzhou 310058, China; [email protected]
First page
2184
Publication year
2023
Publication date
2023
Publisher
MDPI AG
e-ISSN
2077-1312
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
2893296080