Abstract

Visual morphology assessment is routinely used to evaluate embryo quality and select human blastocysts for transfer after in vitro fertilization (IVF). However, assessments vary between embryologists, and as a result the success rate of IVF remains low. To overcome uncertainty about embryo quality, multiple embryos are often implanted, resulting in undesired multiple pregnancies and complications. Unlike other imaging fields, human embryology and IVF have not yet leveraged artificial intelligence (AI) for unbiased, automated embryo assessment. We postulated that an AI approach trained on thousands of embryos could reliably predict embryo quality without human intervention. We implemented an AI approach based on deep neural networks (DNNs) to select the highest-quality embryos, using a large collection of human embryo time-lapse images (about 50,000 images) from a high-volume fertility center in the United States. We developed a framework (STORK) based on Google’s Inception model. STORK predicts blastocyst quality with an AUC of >0.98, generalizes well to images from clinics outside the US, and outperforms individual embryologists. Using clinical data for 2182 embryos, we created a decision tree that integrates embryo quality and patient age to identify scenarios associated with pregnancy likelihood. Our analysis shows that the chance of pregnancy based on an individual embryo varies from 13.8% (age ≥41 and poor quality) to 66.3% (age <37 and good quality), depending on automated blastocyst quality assessment and patient age. In conclusion, our AI-driven approach provides a reproducible way to assess embryo quality and uncovers new, potentially personalized strategies for selecting embryos.
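The abstract reports only the two endpoint scenarios of the decision tree (13.8% for age ≥41 with a poor-quality blastocyst, 66.3% for age <37 with a good-quality one). A minimal sketch of that lookup logic is below; the function name `reported_pregnancy_rate` is hypothetical, and the intermediate age/quality combinations are deliberately left unfilled because the abstract does not report them.

```python
from typing import Optional

def reported_pregnancy_rate(age: int, good_quality: bool) -> Optional[float]:
    """Return the per-embryo pregnancy rate reported in the abstract.

    Only the two endpoint scenarios are published in the abstract;
    all other age/quality combinations return None rather than a
    fabricated estimate.
    """
    if age >= 41 and not good_quality:
        return 0.138  # age >= 41, poor-quality blastocyst
    if age < 37 and good_quality:
        return 0.663  # age < 37, good-quality blastocyst
    return None  # intermediate scenarios: not reported in the abstract
```

The `None` branch is a deliberate design choice: a clinical lookup should surface missing evidence rather than interpolate between the published endpoints.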

Details

Title
Deep learning enables robust assessment and selection of human blastocysts after in vitro fertilization
Author
Pegah Khosravi 1; Ehsan Kazemi 2; Qiansheng Zhan 3; Jonas E. Malmsten 3; Marco Toschi 3; Pantelis Zisimopoulos 1; Alexandros Sigaras 1; Stuart Lavery 4; Lee A. D. Cooper 5; Cristina Hickman 4; Marcos Meseguer 6; Zev Rosenwaks 3; Olivier Elemento 7; Nikica Zaninovic 3; Iman Hajirasouliha 1

1 Weill Cornell Medicine of Cornell University, Institute for Computational Biomedicine, Department of Physiology and Biophysics, New York, USA (GRID:grid.5386.8) (ISNI:000000041936877X); Weill Cornell Medicine, Caryl and Israel Englander Institute for Precision Medicine, The Meyer Cancer Center, New York, USA (GRID:grid.5386.8) (ISNI:000000041936877X)
2 Yale University, Yale Institute for Network Science, New Haven, USA (GRID:grid.47100.32) (ISNI:0000000419368710)
3 Weill Cornell Medicine, The Ronald O. Perelman and Claudia Cohen Center for Reproductive Medicine, New York, USA (GRID:grid.5386.8) (ISNI:000000041936877X)
4 Imperial College, Institute of Reproduction and Developmental Biology, London, UK (GRID:grid.7445.2) (ISNI:0000 0001 2113 8111)
5 Emory University School of Medicine, Department of Biomedical Informatics, Atlanta, USA (GRID:grid.189967.8) (ISNI:0000 0001 0941 6502)
6 Universidad de Valencia, Instituto Valenciano de Infertilidad, Valencia, Spain (GRID:grid.5338.d) (ISNI:0000 0001 2173 938X)
7 Weill Cornell Medicine of Cornell University, Institute for Computational Biomedicine, Department of Physiology and Biophysics, New York, USA (GRID:grid.5386.8) (ISNI:000000041936877X); Weill Cornell Medicine, Caryl and Israel Englander Institute for Precision Medicine, The Meyer Cancer Center, New York, USA (GRID:grid.5386.8) (ISNI:000000041936877X); Weill Cornell Medicine, WorldQuant Initiative for Quantitative Prediction, New York, USA (GRID:grid.5386.8) (ISNI:000000041936877X)
Publication year
2019
Publication date
Dec 2019
Publisher
Nature Publishing Group
e-ISSN
2398-6352
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
2528862264
Copyright
© The Author(s) 2019. This work is published under http://creativecommons.org/licenses/by/4.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.