© 2022 Author(s) (or their employer(s)) 2022. Re-use permitted under CC BY-NC. No commercial re-use. See rights and permissions. Published by BMJ. http://creativecommons.org/licenses/by-nc/4.0/ This is an open access article distributed in accordance with the Creative Commons Attribution Non Commercial (CC BY-NC 4.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited, appropriate credit is given, any changes made indicated, and the use is non-commercial. See:  http://creativecommons.org/licenses/by-nc/4.0/ . Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.

Abstract

Rationale

Spirometry and plethysmography are the gold standard pulmonary function tests (PFTs) for the diagnosis and management of lung disease. Because plethysmography is often inaccessible, spirometry is frequently used alone, but this leads to missed diagnoses or misdiagnoses, as spirometry cannot identify restrictive disease without plethysmography. We aimed to develop a deep learning model to improve the interpretation of spirometry alone.

Methods

We built a multilayer perceptron model using full PFTs from 748 patients, interpreted according to international guidelines. Inputs included spirometry (forced vital capacity, forced expiratory volume in 1 s, forced mid-expiratory flow (FEF25–75)), plethysmography (total lung capacity, residual volume) and biometrics (sex, age, height). The model was developed with 2582 PFTs from 477 patients, randomly divided into training (80%), validation (10%) and test (10%) sets, and refined using 1245 previously unseen PFTs from 271 patients, split 50/50 into validation (136 patients) and test (135 patients) sets. Only one test per patient was used for each of 10 experiments conducted for each input combination. The final model was compared with interpretation of 82 spirometry tests by 6 trained pulmonologists and a decision tree.
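The setup described above can be sketched in a few lines. This is a minimal illustration only: the feature names, hidden-layer sizes, random seeds and synthetic data are assumptions for demonstration, not the authors' actual architecture, hyperparameters or patient data, and scikit-learn stands in for whatever framework the authors used.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 500

# Illustrative biometrics + spirometry inputs:
# sex, age, height, FVC, FEV1, FEF25-75 (synthetic stand-in values)
X = rng.normal(size=(n, 6))
# Illustrative physiology classes, e.g. 0=normal, 1=obstructive,
# 2=restrictive, 3=mixed (randomly assigned here)
y = rng.integers(0, 4, size=n)

# 80/10/10 train/validation/test split, as in the abstract
X_train, X_hold, y_train, y_hold = train_test_split(
    X, y, test_size=0.2, random_state=0)
X_val, X_test, y_val, y_test = train_test_split(
    X_hold, y_hold, test_size=0.5, random_state=0)

# Multilayer perceptron with standardised inputs; layer sizes are a guess
model = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(64, 64), max_iter=500,
                  random_state=0))
model.fit(X_train, y_train)
val_accuracy = model.score(X_val, y_val)
```

With real labelled PFTs in place of the synthetic arrays, `val_accuracy` would correspond to the per-input-combination accuracies reported in the Results.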

Results

Accuracies from the first 477 patients were similar when inputs included biometrics+spirometry+plethysmography (95%±3%) vs biometrics+spirometry (90%±2%). Model refinement with the next 271 patients improved accuracies with biometrics+spirometry (95%±2%) but produced no change for biometrics+spirometry+plethysmography (95%±2%). The final model (94.67%±2.63%) significantly outperformed interpretation of 82 spirometry tests by the decision tree (75.61%±0.00%) and the pulmonologists (66.67%±14.63%), p<0.01 for both.

Conclusions

Deep learning improves the diagnostic acumen of spirometry and classifies lung physiology better than pulmonologists with accuracies comparable to full PFTs.

Details

Title
Deep learning using multilayer perceptron improves the diagnostic acumen of spirometry: a single-centre Canadian study
Author
Mac, Amanda 1; Xu, Tong 1; Wu, Joyce K Y 2; Belousova, Natalia 2; Kitazawa, Haruna 2; Vozoris, Nick 1; Rozenberg, Dmitry 2; Ryan, Clodagh M 2; Valaee, Shahrokh 3; Chung-Wai Chow 2

1 Medicine, Division of Respirology, University of Toronto, Toronto, Ontario, Canada
2 Medicine, Division of Respirology, University of Toronto, Toronto, Ontario, Canada; Department of Medicine, University Health Network, Toronto, Ontario, Canada
3 Electrical and Computer Engineering, University of Toronto, Toronto, Ontario, Canada
First page
e001396
Section
Respiratory physiology
Publication year
2022
Publication date
2022
Publisher
BMJ Publishing Group LTD
e-ISSN
20524439
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
2832011468