
Abstract

Lung cancer is the leading cause of cancer-related deaths and the second most commonly diagnosed cancer worldwide. The advent of PET/CT imaging has made lung tumour segmentation, treatment evaluation, and tumour stage classification significantly more accessible, since a single examination provides both functional and anatomical data. However, integrating images from different modalities is time-consuming for medical professionals and remains challenging, owing to differences in acquisition techniques, image resolutions, and the inherent variations in the spectral and temporal data captured by each modality. Artificial Intelligence (AI) methodologies have shown potential for automating image integration and segmentation. To address these challenges, multimodal-fusion-based U-Net architectures (early fusion, late fusion, dense fusion, hyper-dense fusion, and hyper-dense VGG16 U-Net) are proposed for lung tumour segmentation. A Dice score of 73% shows that the hyper-dense VGG16 U-Net outperforms the other four proposed models. The proposed method can potentially aid medical professionals in detecting lung cancer at an early stage.
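To illustrate the early-fusion idea named in the abstract, the sketch below concatenates co-registered CT and PET slices channel-wise before passing them to a compact U-Net-style network. This is a minimal sketch under assumed settings, not the authors' implementation: the class name `EarlyFusionUNet`, the layer widths, the network depth, and the 128x128 slice size are all illustrative assumptions.

```python
# Minimal sketch of early (input-level) fusion of co-registered CT and PET
# slices for tumour segmentation. Illustrative only: layer widths, names,
# and depth are assumptions, not the architecture described in the paper.
import torch
import torch.nn as nn


def conv_block(in_ch, out_ch):
    """Two 3x3 convolutions with ReLU, the basic U-Net building block."""
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1),
        nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, kernel_size=3, padding=1),
        nn.ReLU(inplace=True),
    )


class EarlyFusionUNet(nn.Module):
    """Tiny U-Net taking a 2-channel input: channel 0 = CT, channel 1 = PET."""

    def __init__(self):
        super().__init__()
        self.enc1 = conv_block(2, 32)          # fused CT+PET input
        self.enc2 = conv_block(32, 64)
        self.pool = nn.MaxPool2d(2)
        self.bottleneck = conv_block(64, 128)
        self.up2 = nn.ConvTranspose2d(128, 64, kernel_size=2, stride=2)
        self.dec2 = conv_block(128, 64)        # 64 (skip) + 64 (upsampled)
        self.up1 = nn.ConvTranspose2d(64, 32, kernel_size=2, stride=2)
        self.dec1 = conv_block(64, 32)
        self.head = nn.Conv2d(32, 1, kernel_size=1)  # tumour/background logit

    def forward(self, ct, pet):
        x = torch.cat([ct, pet], dim=1)        # early fusion: stack modalities as channels
        e1 = self.enc1(x)
        e2 = self.enc2(self.pool(e1))
        b = self.bottleneck(self.pool(e2))
        d2 = self.dec2(torch.cat([self.up2(b), e2], dim=1))
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))
        return self.head(d1)


if __name__ == "__main__":
    ct = torch.randn(1, 1, 128, 128)   # one co-registered CT slice
    pet = torch.randn(1, 1, 128, 128)  # matching PET slice
    logits = EarlyFusionUNet()(ct, pet)
    print(logits.shape)  # torch.Size([1, 1, 128, 128])
```

By contrast, a late-fusion variant would process CT and PET through separate encoders and merge their features near the decoder, while dense and hyper-dense variants add cross-modality connections between encoder layers; those connections are omitted from this simplified sketch.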

Details

Title
Hyper-Dense_Lung_Seg: Multimodal-Fusion-Based Modified U-Net for Lung Tumour Segmentation Using Multimodality of CT-PET Scans
Author
Alshmrani, Goram Mufarah 1; Ni, Qiang 2; Jiang, Richard 2; Muhammed, Nada 3

1 School of Computing and Communications, Lancaster University, Lancaster LA1 4YW, UK; [email protected] (Q.N.); [email protected] (R.J.); College of Computing and Information Technology, University of Bisha, Bisha 67714, Saudi Arabia
2 School of Computing and Communications, Lancaster University, Lancaster LA1 4YW, UK; [email protected] (Q.N.); [email protected] (R.J.)
3 Computers and Control Engineering Department, Faculty of Engineering, Tanta University, Tanta 31733, Egypt; [email protected]
First page
3481
Publication year
2023
Publication date
2023
Publisher
MDPI AG
e-ISSN
2075-4418
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
2893003860
Copyright
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.