
© 2021 Tanti et al. This is an open access article distributed under the terms of the Creative Commons Attribution License: http://creativecommons.org/licenses/by/4.0/ (the “License”), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited. Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.

Abstract

Propagation Phase Contrast Synchrotron Microtomography (PPC-SRμCT) is the gold standard for non-invasive and non-destructive access to internal structures of archaeological remains. For such analyses, the virtual specimen must be segmented to separate different parts or materials, a process that normally requires considerable human effort. In the Automated SEgmentation of Microtomography Imaging (ASEMI) project, we developed a tool to automatically segment these volumetric images, using manually segmented samples to tune and train a machine learning model. For a set of four specimens of ancient Egyptian animal mummies we achieve an overall accuracy of 94–98% when compared with manually segmented slices, approaching the results of off-the-shelf commercial software based on deep learning (97–99%) at much lower complexity. A qualitative analysis of the segmented output shows that our results are close in terms of usability to those from deep learning, justifying the use of these techniques.
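The accuracy figures quoted above are obtained by comparing the automatic segmentation against manually segmented reference slices. A minimal sketch of such a voxel-wise comparison is shown below; the function name and the toy label arrays are illustrative and not taken from the ASEMI tool itself.

```python
import numpy as np

def segmentation_accuracy(predicted, manual):
    """Fraction of voxels whose predicted label matches the manual label."""
    predicted = np.asarray(predicted)
    manual = np.asarray(manual)
    if predicted.shape != manual.shape:
        raise ValueError("label volumes must have the same shape")
    return float(np.mean(predicted == manual))

# Toy 4x4 label "slices" with three materials (0, 1, 2);
# 14 of the 16 voxels agree, so the accuracy is 14/16 = 0.875.
pred = np.array([[0, 0, 1, 1],
                 [0, 0, 1, 1],
                 [2, 2, 1, 1],
                 [2, 2, 0, 1]])
ref  = np.array([[0, 0, 1, 1],
                 [0, 0, 1, 1],
                 [2, 2, 1, 1],
                 [2, 2, 1, 0]])
print(segmentation_accuracy(pred, ref))  # 0.875
```

The same comparison applies unchanged to full 3D volumes, since NumPy's element-wise equality works on arrays of any dimensionality.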

Details

Title
Automated segmentation of microtomography imaging of Egyptian mummies
Author
Tanti, Marc; Berruyer, Camille; Tafforeau, Paul; Muscat, Adrian; Farrugia, Reuben; Scerri, Kenneth; Valentino, Gianluca; Solé, V Armando; Briffa, Johann A
First page
e0260707
Section
Research Article
Publication year
2021
Publication date
Dec 2021
Publisher
Public Library of Science
e-ISSN
1932-6203
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
2610390358