
© The Author(s) 2025. This work is published under http://creativecommons.org/licenses/by-nc-nd/4.0/ (the "License"). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.

Abstract

Malignant ovarian tumors (MOTs) and borderline ovarian tumors (BOTs) differ in treatment strategies and prognosis. However, accurate preoperative diagnosis remains challenging, and improving diagnostic accuracy is crucial. We developed and validated a system using artificial intelligence (AI) to integrate machine learning (ML) models based on blood test data and deep learning (DL) models based on magnetic resonance imaging (MRI) findings to distinguish between MOT and BOT. We analyzed 78 patients with malignant serous ovarian tumors and 31 with borderline serous ovarian tumors treated at our institution. A classification model was developed using ML for blood test data, and a DL model was constructed using MRI data. By integrating these models, we developed three fusion models as multimodal diagnostic AI and compared them with standalone models. The performance was evaluated using precision, recall, and accuracy. The classification model using Light Gradient Boosting Machine achieved an accuracy of 0.825, and the DL model using Visual Geometry Group 16-layer network achieved an accuracy of 0.722 for discriminating BOT from MOT. The intermediate, late, and dense fusion models achieved accuracies of 0.809, 0.776, and 0.825, respectively. Integrating multimodal information such as blood test and imaging data may enhance learning efficiency and improve diagnostic accuracy.
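The record itself contains no code. As a minimal sketch of the late-fusion idea mentioned in the abstract (combining the blood-test ML model's and the MRI DL model's outputs at the decision level), the fragment below averages the two models' predicted malignancy probabilities and thresholds the result. The weight `w`, the threshold, and the example probabilities are illustrative assumptions, not values from the study.

```python
def late_fusion(p_ml, p_dl, w=0.5, threshold=0.5):
    """Late fusion by weighted averaging of per-patient probabilities.

    p_ml: malignancy probabilities from the blood-test (ML) model.
    p_dl: malignancy probabilities from the MRI (DL) model.
    Returns the fused probabilities and hard labels
    (1 = malignant, 0 = borderline).
    """
    fused = [w * a + (1.0 - w) * b for a, b in zip(p_ml, p_dl)]
    labels = [1 if p >= threshold else 0 for p in fused]
    return fused, labels

# Hypothetical probabilities for three patients (not from the study)
fused, labels = late_fusion([0.9, 0.4, 0.6], [0.8, 0.3, 0.2])
```

Intermediate and dense fusion, by contrast, would merge the two modalities' feature representations before the final classifier rather than averaging finished predictions; the averaging shown here is only the simplest decision-level variant.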

Details

Title
Application of multimodal integration to develop preoperative diagnostic models for borderline and malignant ovarian tumors
Author
Kunishima, Atsushi 1; Inaba, Daiki 2; Iyoshi, Shohei 3; Ikeda, Yoshiki 4; Goto, Mayuko 5; Muramatsu, Reina 5; Hashimoto, Mizuki 5; Yoshida, Kosuke 1; Mogi, Kazumasa 1; Yoshihara, Masato 1; Nagao, Yukari 1; Tamauchi, Satoshi 1; Yokoi, Akira 1; Yoshikawa, Nobuhisa 1; Niimi, Kaoru 1; Koizumi, Norihiro 2; Kajiyama, Hiroaki 1

1 Department of Obstetrics and Gynecology, Nagoya University Graduate School of Medicine, 65 Tsurumai-cho, Showa-ku, 466-8550, Nagoya-shi, Aichi, Japan (ROR: https://ror.org/04chrp450) (GRID: grid.27476.30) (ISNI: 0000 0001 0943 978X)
2 Department of Mechanical and Intelligent Systems Engineering, Graduate School of Informatics and Engineering, The University of Electro-Communications, 1-5-1 Chofugaoka, 182-8585, Chofu-shi, Tokyo, Japan (ROR: https://ror.org/02x73b849) (GRID: grid.266298.1) (ISNI: 0000 0000 9271 9936)
3 Department of Obstetrics and Gynecology, Nagoya University Graduate School of Medicine, 65 Tsurumai-cho, Showa-ku, 466-8550, Nagoya-shi, Aichi, Japan (ROR: https://ror.org/04chrp450) (GRID: grid.27476.30) (ISNI: 0000 0001 0943 978X); Institute for Advanced Research, Nagoya University, Furo-cho, Chikusa-ku, 464-8601, Nagoya, Japan (ROR: https://ror.org/04chrp450) (GRID: grid.27476.30) (ISNI: 0000 0001 0943 978X)
4 Department of Obstetrics and Gynecology, Nagoya University Graduate School of Medicine, 65 Tsurumai-cho, Showa-ku, 466-8550, Nagoya-shi, Aichi, Japan (ROR: https://ror.org/04chrp450) (GRID: grid.27476.30) (ISNI: 0000 0001 0943 978X); Department of Obstetrics and Gynecology, Kasugai Municipal Hospital, 1-1-1 Takaki-cho, 486-8510, Kasugai-shi, Aichi, Japan (ROR: https://ror.org/019ekef14) (GRID: grid.415067.1) (ISNI: 0000 0004 1772 4590)
5 Nagoya University School of Medicine, 65 Tsurumai-cho, Showa-ku, 466-8550, Nagoya-shi, Aichi, Japan (ROR: https://ror.org/04chrp450) (GRID: grid.27476.30) (ISNI: 0000 0001 0943 978X)
Pages
37114
Section
Article
Publication year
2025
Publication date
2025
Publisher
Nature Publishing Group
e-ISSN
2045-2322
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
3264452578