Abstract

There is increasing interest in developing artificial intelligence (AI) systems to process and interpret electronic health records (EHRs). Natural language processing (NLP) powered by pretrained language models is the key technology enabling medical AI systems to utilize clinical narratives. However, few language models have been trained in the clinical domain, and the largest of these is comparatively small at 110 million parameters (compared with billions of parameters in the general domain). It is unclear how large clinical language models with billions of parameters could help medical AI systems utilize unstructured EHRs. In this study, we develop from scratch a large clinical language model, GatorTron, using >90 billion words of text (including >82 billion words of de-identified clinical text) and systematically evaluate it on five clinical NLP tasks: clinical concept extraction, medical relation extraction, semantic textual similarity, natural language inference (NLI), and medical question answering (MQA). We examine how (1) scaling up the number of parameters and (2) scaling up the size of the training data benefit these NLP tasks. GatorTron scales the clinical language model from 110 million to 8.9 billion parameters and improves performance on all five clinical NLP tasks (e.g., 9.6% and 9.5% accuracy improvements on NLI and MQA, respectively), which can be applied to medical AI systems to improve healthcare delivery. The GatorTron models are publicly available at: https://catalog.ngc.nvidia.com/orgs/nvidia/teams/clara/models/gatortron_og.
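The 110-million-parameter figure cited for prior clinical models (BERT-style encoders) can be sanity-checked with a back-of-the-envelope parameter count. The sketch below uses the standard BERT-base configuration (12 layers, hidden size 768, ~30k-token vocabulary); these values are general knowledge about BERT-base, not details drawn from this record:

```python
# Rough parameter count for a BERT-style transformer encoder.
# A back-of-the-envelope sketch; biases and LayerNorm weights
# (~13*H per layer) are omitted as negligible at this scale.

def transformer_params(num_layers: int, hidden: int, vocab: int, max_pos: int = 512) -> int:
    # Embeddings: token + position + segment (2 types), each of width `hidden`.
    embed = (vocab + max_pos + 2) * hidden
    # Per encoder layer: attention Q/K/V/output projections (4*H^2)
    # plus the two feed-forward matrices of size H x 4H (8*H^2).
    per_layer = 12 * hidden ** 2
    return embed + num_layers * per_layer

# Standard BERT-base settings (assumed, not from the paper itself).
base = transformer_params(num_layers=12, hidden=768, vocab=30522)
print(f"BERT-base estimate: {base / 1e6:.0f}M parameters")  # ≈ 109M
```

Scaling from ~110M to the 8.9B parameters of the largest GatorTron model is dominated by the `num_layers * 12 * hidden**2` term, i.e., by growing the depth and hidden width of the encoder.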

Details

Title
A large language model for electronic health records
Author
Yang, Xi 1; Chen, Aokun 1; PourNejatian, Nima 2; Shin, Hoo Chang 2; Smith, Kaleb E. 2; Parisien, Christopher 2; Compas, Colin 2; Martin, Cheryl 2; Costa, Anthony B. 2; Flores, Mona G. 2; Zhang, Ying 3; Magoc, Tanja 4; Harle, Christopher A. 5; Lipori, Gloria 6; Mitchell, Duane A. 7; Hogan, William R. 8; Shenkman, Elizabeth A. 8; Bian, Jiang 1; Wu, Yonghui 1

1 University of Florida, Department of Health Outcomes and Biomedical Informatics, College of Medicine, Gainesville, USA (GRID:grid.15276.37) (ISNI:0000 0004 1936 8091); University of Florida Health Cancer Center, Cancer Informatics and eHealth core, Gainesville, USA (GRID:grid.430508.a) (ISNI:0000 0004 4911 114X)
2 NVIDIA, Santa Clara, USA (GRID:grid.451133.1) (ISNI:0000 0004 0458 4453)
3 University of Florida, Research Computing, Gainesville, USA (GRID:grid.15276.37) (ISNI:0000 0004 1936 8091)
4 University of Florida, Integrated Data Repository Research Services, Gainesville, USA (GRID:grid.15276.37) (ISNI:0000 0004 1936 8091)
5 University of Florida, Department of Health Outcomes and Biomedical Informatics, College of Medicine, Gainesville, USA (GRID:grid.15276.37) (ISNI:0000 0004 1936 8091); University of Florida, Integrated Data Repository Research Services, Gainesville, USA (GRID:grid.15276.37) (ISNI:0000 0004 1936 8091)
6 University of Florida, Integrated Data Repository Research Services, Gainesville, USA (GRID:grid.15276.37) (ISNI:0000 0004 1936 8091); University of Florida, Lillian S. Wells Department of Neurosurgery, UF Clinical and Translational Science Institute, Gainesville, USA (GRID:grid.15276.37) (ISNI:0000 0004 1936 8091)
7 University of Florida, Lillian S. Wells Department of Neurosurgery, UF Clinical and Translational Science Institute, Gainesville, USA (GRID:grid.15276.37) (ISNI:0000 0004 1936 8091)
8 University of Florida, Department of Health Outcomes and Biomedical Informatics, College of Medicine, Gainesville, USA (GRID:grid.15276.37) (ISNI:0000 0004 1936 8091)
Pages
194
Publication year
2022
Publication date
Dec 2022
Publisher
Nature Publishing Group
e-ISSN
2398-6352
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
2758195254
Copyright
© The Author(s) 2022. This work is published under http://creativecommons.org/licenses/by/4.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.