Abstract

Human skin cancer is commonly detected visually through clinical screening followed by dermoscopic examination. Automated skin lesion classification nevertheless remains challenging because benign and melanoma lesions can look very similar. In this work, we propose a new artificial-intelligence-based method for classifying skin lesions using a residual deep convolutional neural network. The network applies several convolutional filters for multi-layer feature extraction and captures cross-channel correlation by sliding dot-product filters rather than sliding filters along the horizontal axis only. The proposed method addresses the imbalanced-dataset problem by converting each (image, label) pair in the dataset into an (image, weight) pair. It was tested and evaluated on the challenging ISIC-2019 and ISIC-2020 datasets, where it outperformed existing deep convolutional networks in the multiclass classification of skin lesions.
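The abstract does not give the exact weighting formula, but the (image, label) → (image, weight) conversion it describes can be illustrated with a minimal sketch, assuming a common inverse-class-frequency scheme (a hypothetical choice, not necessarily the authors' own):

```python
from collections import Counter

def label_weights(labels):
    """Map each label to an inverse-frequency weight so every class
    contributes equally overall. This is an illustrative scheme; the
    paper's actual weighting is not specified in the abstract."""
    counts = Counter(labels)
    n = len(labels)          # total number of samples
    k = len(counts)          # number of classes
    # weight = n / (k * count[label]); a perfectly balanced dataset yields 1.0
    return [n / (k * counts[y]) for y in labels]

labels = ["melanoma", "nevus", "nevus", "nevus"]
weights = label_weights(labels)
# pair each image with its weight instead of (or alongside) its raw label
dataset = list(zip(["img0", "img1", "img2", "img3"], weights))
```

Here the rare melanoma sample receives weight 2.0 while each common nevus sample receives 2/3, so the minority class is not drowned out during training.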

Details

Title
Skin-Net: a novel deep residual network for skin lesions classification using multilevel feature extraction and cross-channel correlation with detection of outlier
Author
Alsahafi, Yousef S. 1; Kassem, Mohamed A. 2; Hosny, Khalid M. 3

1 Khulis College, University of Jeddah, Department of Information Technology, Jeddah, Saudi Arabia (GRID:grid.460099.2)
2 Kafrelsheikh University, Department of Robotics and Intelligent Machines, Faculty of Artificial Intelligence, Kafr El-Sheikh, Egypt (GRID:grid.411978.2) (ISNI:0000 0004 0578 3577)
3 Zagazig University, Department of Information Technology, Faculty of Computers and Informatics, Zagazig, Egypt (GRID:grid.31451.32) (ISNI:0000 0001 2158 2757)
Pages
105
Publication year
2023
Publication date
Jun 2023
Publisher
Springer Nature B.V.
e-ISSN
2196-1115
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
2827368883
Copyright
© The Author(s) 2023. This work is published under http://creativecommons.org/licenses/by/4.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.