Abstract

A wound can be defined as a breakdown in the protective function of the skin. Identifying the type of wound at an early stage can make the treatment process more effective. Paramedical staff usually examine a wound visually, assessing its size and healing status. Telemedicine and other digitally delivered healthcare services aim to bring health facilities to the patient's doorstep. These services played a particularly significant role during the COVID-19 pandemic, when in-hospital physical examination of patients could accelerate the spread of the disease. Remote wound assessment demands high-quality medical images or videos and accurate localization of the wound area. Manual diagnosis is neither robust nor reliable because of variability in the appearance of the wound site, whereas automatic wound detection, localization, and classification can assist a physician with greater accuracy and robustness. In this work, an automatic wound detection technique based on the YOLOv3 model is proposed. The proposed technique detects, localizes, and classifies wounds into four main categories: stitch wound, cut wound, open wound, and normal skin. Experimental results show that the proposed technique is efficient and robust, achieving 99% accuracy and outperforming its counterparts.
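Since the abstract only names the detector and the four classes, a minimal inference sketch may help illustrate the pipeline. The Python snippet below assumes a YOLOv3 network fine-tuned on the four wound classes and exported in Darknet format, loaded through OpenCV's DNN module; the file names, input size, and thresholds are hypothetical placeholders, and this is a sketch of the general YOLOv3 workflow rather than the authors' exact implementation.

import cv2
import numpy as np

# Class order per the abstract; the index-to-label mapping is an assumption.
CLASSES = ["stitch-wound", "cut-wound", "open-wound", "normal-skin"]

# Hypothetical file names for a YOLOv3 network fine-tuned on wound images.
net = cv2.dnn.readNetFromDarknet("yolov3-wound.cfg", "yolov3-wound.weights")

def detect_wounds(image_path, conf_thresh=0.5, nms_thresh=0.4):
    img = cv2.imread(image_path)
    h, w = img.shape[:2]
    # Standard YOLOv3 preprocessing: scale to [0, 1], resize to 416x416, BGR->RGB.
    blob = cv2.dnn.blobFromImage(img, 1 / 255.0, (416, 416), swapRB=True, crop=False)
    net.setInput(blob)
    outputs = net.forward(net.getUnconnectedOutLayersNames())

    boxes, confidences, class_ids = [], [], []
    for output in outputs:
        for det in output:  # det = [cx, cy, bw, bh, objectness, score_0..score_3]
            scores = det[5:]
            class_id = int(np.argmax(scores))
            conf = float(scores[class_id])
            if conf > conf_thresh:
                # Coordinates are normalized; scale back to the original image.
                cx, cy, bw, bh = det[:4] * np.array([w, h, w, h])
                boxes.append([int(cx - bw / 2), int(cy - bh / 2), int(bw), int(bh)])
                confidences.append(conf)
                class_ids.append(class_id)

    # Non-maximum suppression removes overlapping duplicate detections.
    keep = cv2.dnn.NMSBoxes(boxes, confidences, conf_thresh, nms_thresh)
    return [(CLASSES[class_ids[i]], confidences[i], boxes[i])
            for i in np.array(keep).flatten()]

Each returned tuple carries the predicted wound category, its confidence, and the bounding box (x, y, width, height), covering the detection, localization, and classification steps the abstract describes.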

Details

Title
An Automatic Wound Detection System Empowered by Deep Learning
Author
Muhammad Adnan 1 ; Muhammad Asif 1 ; Maaz Bin Ahmad 2 ; Toqeer Mahmood 3 ; Khalid Masood 1 ; Rehan Ashraf 3 ; CM Nadeem Faisal 3 

1 Department of Computer Science, Lahore Garrison University, Lahore, Pakistan
2 College of Computing and Information Science, KIET, Karachi, Pakistan
3 Department of Computer Science, National Textile University, Faisalabad, Pakistan
First page
012005
Publication year
2023
Publication date
Jul 2023
Publisher
IOP Publishing
ISSN
1742-6588
e-ISSN
1742-6596
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
2845647591
Copyright
Published under licence by IOP Publishing Ltd. This work is published under http://creativecommons.org/licenses/by/3.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.