Abstract

We aimed to develop and validate a deep learning model for automated segmentation and histomorphometry of myelinated peripheral nerve fibers in light microscopic images. A convolutional neural network integrated into the AxonDeepSeg framework was trained for automated axon/myelin segmentation on a dataset of light-microscopic cross-sectional images of osmium tetroxide-stained rat nerves covering various stages of axonal regeneration. On a second dataset, the accuracy of the automated segmentation was determined against manual axon/myelin labels. Automated morphometry results, including axon diameter, myelin sheath thickness, and g-ratio, were compared against manual straight-line measurements, with morphometrics extracted from manual labels by AxonDeepSeg serving as the reference standard. The neural network achieved high pixel-wise accuracy for nerve fiber segmentations, with a mean (± standard deviation) ground-truth overlap of 0.93 (± 0.03) for axons and 0.99 (± 0.01) for myelin sheaths. Nerve fibers were identified with a sensitivity of 0.99 and a precision of 0.97. For each nerve fiber, the myelin thickness, axon diameter, g-ratio, solidity, eccentricity, orientation, and individual x- and y-coordinates were determined automatically. Compared to manual morphometry, automated histomorphometry showed superior agreement with the reference standard while reducing the analysis time to below 2.5% of that needed for manual morphometry. This open-source convolutional neural network provides rapid and accurate morphometry of entire peripheral nerve cross-sections. Given its ease of use, it could yield significant time savings in biomedical research while extracting unprecedented amounts of objective morphologic information from large image datasets.
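For orientation, the g-ratio reported in the abstract is conventionally defined as the inner axon diameter divided by the outer fiber diameter (axon diameter plus twice the myelin thickness), and the pixel-wise "ground truth overlap" reads as a Dice-style overlap score (an assumption; the abstract does not name the metric). A minimal sketch of these standard definitions, not the authors' AxonDeepSeg implementation:

```python
import numpy as np

def g_ratio(axon_diameter: float, myelin_thickness: float) -> float:
    """g-ratio = inner axon diameter / outer fiber diameter.

    The outer fiber diameter adds the myelin sheath on both sides.
    """
    fiber_diameter = axon_diameter + 2.0 * myelin_thickness
    return axon_diameter / fiber_diameter

def dice_overlap(mask_a, mask_b) -> float:
    """Dice coefficient between two binary segmentation masks.

    Returns 1.0 for two empty masks by convention.
    """
    a = np.asarray(mask_a, dtype=bool)
    b = np.asarray(mask_b, dtype=bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

# Hypothetical fiber: 6 µm axon with a 1 µm myelin sheath
print(g_ratio(6.0, 1.0))  # 6 / (6 + 2) = 0.75
```

Values reported per fiber (diameter, thickness, g-ratio) follow directly from the segmented axon and myelin masks in this way; the shape descriptors (solidity, eccentricity, orientation) are standard region properties of the axon mask.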

Details

Title
Rapid, automated nerve histomorphometry through open-source artificial intelligence
Author
Daeschler Simeon Christian 1; Bourget Marie-Hélène 2; Derakhshan Dorsa 3; Sharma Vasudev 4; Asenov Stoyan Ivaylov 2; Gordon Tessa 5; Cohen-Adad Julien 6; Borschel Gregory Howard 7

1 SickKids Research Institute, Neuroscience and Mental Health Program, Hospital for Sick Children (SickKids), Toronto, Canada (GRID:grid.42327.30) (ISNI:0000 0004 0473 9646)
2 Polytechnique Montreal, NeuroPoly Laboratory, Institute of Biomedical Engineering, Montreal, Canada (GRID:grid.183158.6) (ISNI:0000 0004 0435 3292)
3 University of Toronto, Toronto, Canada (GRID:grid.17063.33) (ISNI:0000 0001 2157 2938)
4 Polytechnique Montreal, NeuroPoly Laboratory, Institute of Biomedical Engineering, Montreal, Canada (GRID:grid.183158.6) (ISNI:0000 0004 0435 3292); University of Toronto, Toronto, Canada (GRID:grid.17063.33) (ISNI:0000 0001 2157 2938)
5 SickKids Research Institute, Neuroscience and Mental Health Program, Hospital for Sick Children (SickKids), Toronto, Canada (GRID:grid.42327.30) (ISNI:0000 0004 0473 9646); the Hospital for Sick Children, Division of Plastic and Reconstructive Surgery, Toronto, Canada (GRID:grid.42327.30) (ISNI:0000 0004 0473 9646)
6 Polytechnique Montreal, NeuroPoly Laboratory, Institute of Biomedical Engineering, Montreal, Canada (GRID:grid.183158.6) (ISNI:0000 0004 0435 3292); CRIUGM, University of Montreal, Functional Neuroimaging Unit, Montreal, Canada (GRID:grid.14848.31) (ISNI:0000 0001 2292 3357); Mila - Quebec AI Institute, Montreal, Canada (GRID:grid.14848.31)
7 SickKids Research Institute, Neuroscience and Mental Health Program, Hospital for Sick Children (SickKids), Toronto, Canada (GRID:grid.42327.30) (ISNI:0000 0004 0473 9646); University of Toronto, Toronto, Canada (GRID:grid.17063.33) (ISNI:0000 0001 2157 2938); the Hospital for Sick Children, Division of Plastic and Reconstructive Surgery, Toronto, Canada (GRID:grid.42327.30) (ISNI:0000 0004 0473 9646); Indiana University School of Medicine, Indianapolis, USA (GRID:grid.257413.6) (ISNI:0000 0001 2287 3919)
Publication year
2022
Publication date
2022
Publisher
Nature Publishing Group
e-ISSN
2045-2322
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
2648333106
Copyright
© The Author(s) 2022. This work is published under http://creativecommons.org/licenses/by/4.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.