Abstract

Low-field portable magnetic resonance imaging (MRI) scanners are more accessible, more cost-effective, and more sustainable, with lower carbon emissions, than superconducting high-field MRI scanners. However, the images they produce have relatively poor quality, with lower signal-to-noise ratio and limited spatial resolution. This study develops and evaluates an image-to-image translation deep learning model, LoHiResGAN, to enhance the quality of low-field (64mT) MRI scans by generating synthetic high-field (3T) MRI scans. We employed a paired dataset comprising T1- and T2-weighted MRI sequences acquired at 64mT and 3T and compared the performance of the LoHiResGAN model with that of other state-of-the-art models, including GANs, CycleGAN, U-Net, and cGAN. Our proposed method demonstrates superior performance on image quality metrics such as normalized root-mean-squared error (NRMSE), structural similarity index measure (SSIM), peak signal-to-noise ratio (PSNR), and the perception-based image quality evaluator (PIQE). Additionally, we evaluated the accuracy of brain morphometry measurements for 33 brain regions across the original 3T, 64mT, and synthetic 3T images. The results indicate that the synthetic 3T images created with our proposed LoHiResGAN model significantly improve the image quality of low-field MRI data compared with the other methods (GANs, CycleGAN, U-Net, cGAN) and yield brain morphometry measurements that are more consistent with those from 3T across the various brain regions. Synthetic images generated by our method demonstrated high quality both quantitatively and qualitatively. However, additional research involving diverse datasets and clinical validation is necessary to fully establish its applicability to clinical diagnostics, especially in settings where high-field MRI scanners are less accessible.
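
For reference, the sketch below shows how the paired image-quality metrics named in the abstract (NRMSE, SSIM, PSNR) could be computed, assuming co-registered 2D slices stored as NumPy float arrays and using scikit-image's metric functions. The function name paired_quality_metrics and the data-range choice are illustrative assumptions, not the authors' implementation, and PIQE is omitted because it has no standard scikit-image counterpart.

```python
# Minimal sketch: paired image-quality metrics between a reference 3T slice
# and a synthetic 3T slice. Assumes 2D float arrays on a common intensity
# scale; PIQE is not included (no standard scikit-image implementation).
import numpy as np
from skimage.metrics import (
    normalized_root_mse,
    peak_signal_noise_ratio,
    structural_similarity,
)


def paired_quality_metrics(reference_3t: np.ndarray, synthetic_3t: np.ndarray) -> dict:
    """Return NRMSE, SSIM, and PSNR for one pair of co-registered slices."""
    data_range = float(reference_3t.max() - reference_3t.min())
    return {
        "nrmse": normalized_root_mse(reference_3t, synthetic_3t),
        "ssim": structural_similarity(reference_3t, synthetic_3t, data_range=data_range),
        "psnr": peak_signal_noise_ratio(reference_3t, synthetic_3t, data_range=data_range),
    }


if __name__ == "__main__":
    # Synthetic demo data only; in the study the reference is the paired 3T scan.
    rng = np.random.default_rng(0)
    reference = rng.random((256, 256))
    synthetic = np.clip(reference + 0.05 * rng.standard_normal((256, 256)), 0.0, 1.0)
    print(paired_quality_metrics(reference, synthetic))
```

In practice such metrics would be averaged over all slices or volumes of the test set; the random arrays above merely illustrate the call pattern.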

Details

Title
Improving portable low-field MRI image quality through image-to-image translation using paired low- and high-field images
Author
Islam, Kh Tohidul 1; Zhong, Shenjun 2; Zakavi, Parisa 1; Chen, Zhifeng 3; Kavnoudias, Helen 4; Farquharson, Shawna 5; Durbridge, Gail 6; Barth, Markus 7; McMahon, Katie L. 8; Parizel, Paul M. 9; Dwyer, Andrew 10; Egan, Gary F. 1; Law, Meng 4; Chen, Zhaolin 3

1  Monash University, Monash Biomedical Imaging, Melbourne, Australia (GRID:grid.1002.3) (ISNI:0000 0004 1936 7857) 
2  Monash University, Monash Biomedical Imaging, Melbourne, Australia (GRID:grid.1002.3) (ISNI:0000 0004 1936 7857); Australian National Imaging Facility, Brisbane, Australia (GRID:grid.507684.8) 
3  Monash University, Monash Biomedical Imaging, Melbourne, Australia (GRID:grid.1002.3) (ISNI:0000 0004 1936 7857); Monash University, Department of Data Science and AI, Faculty of Information Technology, Melbourne, Australia (GRID:grid.1002.3) (ISNI:0000 0004 1936 7857) 
4  Monash University, Department of Neuroscience, Central Clinical School, Melbourne, Australia (GRID:grid.1002.3) (ISNI:0000 0004 1936 7857); Alfred Hospital, Department of Radiology, Melbourne, Australia (GRID:grid.1623.6) (ISNI:0000 0004 0432 511X) 
5  Australian National Imaging Facility, Brisbane, Australia (GRID:grid.507684.8) 
6  University of Queensland, Herston Imaging Research Facility, Brisbane, Australia (GRID:grid.1003.2) (ISNI:0000 0000 9320 7537) 
7  University of Queensland, School of Information Technology and Electrical Engineering and Centre for Advanced Imaging, Brisbane, Australia (GRID:grid.1003.2) (ISNI:0000 0000 9320 7537) 
8  Queensland University of Technology, School of Clinical Science, Herston Imaging Research Facility, Brisbane, Australia (GRID:grid.1024.7) (ISNI:0000 0000 8915 0953) 
9  Royal Perth Hospital, David Hartley Chair of Radiology, Department of Radiology, Perth, Australia (GRID:grid.416195.e) (ISNI:0000 0004 0453 3875); University of Western Australia, Medical School, Perth, Australia (GRID:grid.1012.2) (ISNI:0000 0004 1936 7910) 
10  South Australian Health and Medical Research Institute, Adelaide, Australia (GRID:grid.430453.5) (ISNI:0000 0004 0565 2606) 
Pages
21183
Publication year
2023
Publication date
2023
Publisher
Nature Publishing Group
e-ISSN
2045-2322
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
2896083852
Copyright
© The Author(s) 2023. This work is published under http://creativecommons.org/licenses/by/4.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.