Abstract

Interpretability in machine learning models is important in high-stakes decisions, such as whether to order a biopsy based on a mammographic exam. Mammography poses challenges that are not present in other computer vision tasks: datasets are small, confounding information is present, and it can be difficult even for a radiologist to decide between watchful waiting and biopsy on the basis of a mammogram alone. In this work, we present a framework for interpretable machine-learning-based mammography. In addition to predicting whether a lesion is malignant or benign, our work aims to follow the reasoning processes of radiologists in detecting clinically relevant semantic features of each image, such as the characteristics of the mass margins. The framework includes a novel interpretable neural network algorithm that uses case-based reasoning for mammography. Our algorithm can incorporate a combination of data with whole-image labels and data with pixel-wise annotations, leading to better accuracy and interpretability even with a small number of images. Our interpretable models are able to highlight the classification-relevant parts of the image, whereas other methods highlight healthy tissue and confounding information. Our models are decision aids, rather than decision makers, and aim for better overall human-machine collaboration. We observe no loss in mass-margin classification accuracy relative to a black-box neural network trained on the same data.

The black-box nature of neural networks is a concern for high-stakes medical applications in which decisions must be based on medically relevant features. The authors develop an interpretable machine learning-based framework that aims to follow the reasoning processes of radiologists in providing predictions for cancer diagnosis in mammography.
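The case-based reasoning described above can be illustrated with a prototype-similarity sketch. This is a simplified, hypothetical illustration of ProtoPNet-style scoring, not the authors' exact implementation: each spatial patch of an image's feature map is compared against learned prototype vectors, the best match per prototype is kept, and the similarity scores are weighted into class logits. All names and shapes here are illustrative assumptions.

```python
import numpy as np

def prototype_scores(feature_map, prototypes, epsilon=1e-4):
    """For each prototype, take the maximum similarity over all
    spatial positions of the feature map (ProtoPNet-style max-pooling).
    feature_map: (H, W, D) array of patch features
    prototypes:  (P, D) array of learned prototype vectors
    Returns a length-P vector of similarity scores."""
    H, W, D = feature_map.shape
    patches = feature_map.reshape(-1, D)                      # (H*W, D)
    # squared L2 distance from every patch to every prototype
    d2 = ((patches[:, None, :] - prototypes[None, :, :]) ** 2).sum(-1)
    # map distances to bounded similarities, then max-pool over positions
    sim = np.log((d2 + 1.0) / (d2 + epsilon))
    return sim.max(axis=0)                                    # (P,)

def predict(feature_map, prototypes, class_weights):
    """Class logits as a weighted sum of prototype similarities."""
    s = prototype_scores(feature_map, prototypes)
    return class_weights @ s                                  # (C, P) @ (P,) -> (C,)

# toy example: 2 classes, 4 prototypes, 8-dim features on a 3x3 grid
rng = np.random.default_rng(0)
fmap = rng.normal(size=(3, 3, 8))
protos = rng.normal(size=(4, 8))
weights = rng.normal(size=(2, 4))
logits = predict(fmap, protos, weights)
```

Because each logit decomposes into per-prototype contributions, the model can point to the training-case prototype (and the image region most similar to it) that drove a prediction, which is what makes this family of models interpretable by design.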

Details

Title
A case-based interpretable deep learning model for classification of mass lesions in digital mammography
Author
Barnett, Alina Jade 1; Schwartz, Fides Regina 2; Tao, Chaofan 1; Chen, Chaofan 3; Ren, Yinhao 4; Lo, Joseph Y. 5; Rudin, Cynthia 6

1 Duke University, Department of Computer Science, Durham, USA (GRID:grid.26009.3d) (ISNI:0000 0004 1936 7961)
2 Duke University, Department of Radiology, Durham, USA (GRID:grid.26009.3d) (ISNI:0000 0004 1936 7961)
3 University of Maine, School of Computing and Information Science, Orono, USA (GRID:grid.21106.34) (ISNI:0000 0001 2182 0794)
4 Duke University, Department of Biomedical Engineering, Durham, USA (GRID:grid.26009.3d) (ISNI:0000 0004 1936 7961)
5 Duke University, Department of Radiology, Durham, USA (GRID:grid.26009.3d) (ISNI:0000 0004 1936 7961); Duke University, Department of Biomedical Engineering, Durham, USA (GRID:grid.26009.3d) (ISNI:0000 0004 1936 7961); Duke University, Department of Electrical and Computer Engineering, Durham, USA (GRID:grid.26009.3d) (ISNI:0000 0004 1936 7961)
6 Duke University, Department of Computer Science, Durham, USA (GRID:grid.26009.3d) (ISNI:0000 0004 1936 7961); Duke University, Department of Electrical and Computer Engineering, Durham, USA (GRID:grid.26009.3d) (ISNI:0000 0004 1936 7961); Duke University, Department of Statistical Science, Durham, USA (GRID:grid.26009.3d) (ISNI:0000 0004 1936 7961); Duke University, Department of Biostatistics & Bioinformatics, Durham, USA (GRID:grid.26009.3d) (ISNI:0000 0004 1936 7961)
Pages
1061-1070
Publication year
2021
Publication date
Dec 2021
Publisher
Nature Publishing Group
e-ISSN
2522-5839
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
2622645976
Copyright
© The Author(s), under exclusive licence to Springer Nature Limited 2021.