Abstract

It remains challenging to diagnose biliary atresia (BA) accurately from sonographic gallbladder images, particularly in rural areas without relevant expertise. To help diagnose BA from sonographic gallbladder images, an ensembled deep learning model is developed. The model yields a patient-level sensitivity of 93.1% and specificity of 93.9% [with an area under the receiver operating characteristic curve of 0.956 (95% confidence interval: 0.928-0.977)] on the multi-center external validation dataset, superior to that of human experts. With the help of the model, the performance of human experts at various levels of expertise is improved. Moreover, diagnosis by the model based on smartphone photos of sonographic gallbladder images, captured through a smartphone app, and on video sequences still achieves expert-level performance. The ensembled deep learning model in this study provides a solution to help radiologists improve the diagnosis of BA in various clinical application scenarios, particularly in rural and undeveloped regions with limited expertise.

It remains challenging to diagnose biliary atresia (BA) accurately from sonographic gallbladder images, particularly in rural areas without relevant expertise. Here, the authors develop a diagnostic deep learning model which shows favourable performance in comparison with human experts in multi-center external validation.

Details

Title
Ensembled deep learning model outperforms human experts in diagnosing biliary atresia from sonographic gallbladder images
Author
Zhou, Wenying 1; Yang, Yang 2; Cheng, Yu 3; Liu, Juxian 4; Duan, Xingxing 5; Weng, Zongjie 6; Chen, Dan 7; Liang, Qianhong 8; Fang, Qin 9; Zhou, Jiaojiao 4; Ju, Hao 10; Luo, Zhenhua 11; Guo, Weihao 1; Ma, Xiaoyan 7; Xie, Xiaoyan 1; Wang, Ruixuan 2; Zhou, Luyao 1

1  Sun Yat-sen University, Department of Medical Ultrasonics, Institute for Diagnostic and Interventional Ultrasound, The First Affiliated Hospital, Guangzhou, P. R. China (GRID:grid.12981.33) (ISNI:0000 0001 2360 039X) 
2  Sun Yat-sen University, School of Computer Science and Engineering, Guangzhou, P. R. China (GRID:grid.12981.33) (ISNI:0000 0001 2360 039X) 
3  Huazhong University of Science and Technology, Department of Ultrasound, Union Hospital, Tongji Medical College, Wuhan, P. R. China (GRID:grid.33199.31) (ISNI:0000 0004 0368 7223) 
4  Sichuan University, Department of Ultrasound, West China Hospital, Chengdu, P. R. China (GRID:grid.13291.38) (ISNI:0000 0001 0807 1581) 
5  Hunan Children’s Hospital, Department of Ultrasound, Changsha, P. R. China (GRID:grid.440223.3) 
6  Affiliated Hospital of Fujian Medical University, Department of Medical Ultrasonics, Fujian Provincial Maternity and Children’s Hospital, Fuzhou City, P. R. China (GRID:grid.256112.3) (ISNI:0000 0004 1797 9307) 
7  Guangdong Women and Children’s Hospital, Department of Ultrasound, Guangzhou, P. R. China (GRID:grid.459579.3) 
8  Hexian Memorial Affiliated Hospital of Southern Medical University, Department of Ultrasound, Guangzhou, P. R. China (GRID:grid.284723.8) (ISNI:0000 0000 8877 7471) 
9  The First People’s Hospital of Foshan, Department of Ultrasound, Foshan City, P. R. China (GRID:grid.284723.8) 
10  Shengjing Hospital of China Medical University, Department of Ultrasound, Shenyang, P. R. China (GRID:grid.412467.2) (ISNI:0000 0004 1806 3501) 
11  Sun Yat-sen University, Institute of Precision Medicine, The First Affiliated Hospital, Guangzhou, P. R. China (GRID:grid.12981.33) (ISNI:0000 0001 2360 039X) 
Publication year
2021
Publication date
2021
Publisher
Nature Publishing Group
e-ISSN
2041-1723
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
2492786851
Copyright
© The Author(s) 2021. This work is published under http://creativecommons.org/licenses/by/4.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.