Convolutional neural network-based object detection model to identify gastrointestinal stromal tumors in endoscopic ultrasound images

Chang Kyo Oh, Taewan Kim, Yu Kyung Cho, Dae Young Cheung, Bo In Lee, Young Seok Cho, Jin Il Kim, Myung Gyu Choi, Han Hee Lee, Seungchul Lee

Research output: Contribution to journal › Article › peer-review

28 Scopus citations

Abstract

Background and Aim: We aimed to develop a convolutional neural network (CNN)-based object detection model for discriminating between gastric subepithelial tumors, such as gastrointestinal stromal tumors (GISTs) and leiomyomas, in endoscopic ultrasound (EUS) images.

Methods: We used 376 images from 114 patients with histologically confirmed gastric GIST or leiomyoma to train the EUS-CNN. We constructed the EUS-CNN using an EfficientNet CNN model for feature extraction and a weighted bi-directional feature pyramid network (BiFPN) for object detection. We assessed the performance of the EUS-CNN by calculating its accuracy, sensitivity, specificity, and area under the receiver operating characteristic curve (AUC) on a validation set of 170 images from 54 patients. Four EUS experts and 15 EUS trainees judged the same validation dataset, and their diagnostic yields were compared with those of the EUS-CNN.

Results: In the per-image analysis, the sensitivity, specificity, accuracy, and AUC of the EUS-CNN were 95.6%, 82.1%, 91.2%, and 0.9234, respectively. In the per-patient analysis, the corresponding values were 100.0%, 85.7%, 96.3%, and 0.9929. The EUS-CNN outperformed human assessment in accuracy, sensitivity, and negative predictive value.

Conclusions: We developed an EUS-CNN system with high diagnostic ability for predicting gastric GIST. This system can be helpful not only for less-experienced endoscopists but also for experienced ones. Further accumulation of EUS images and prospective studies are required, along with validation in a large multicenter trial.
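The Methods name two components: an EfficientNet backbone for feature extraction and a weighted bi-directional feature pyramid network (BiFPN) for detection. As an illustration of the BiFPN's distinguishing idea only, the sketch below implements its "fast normalized fusion" step in PyTorch; the module name WeightedFusion and the toy feature shapes are hypothetical, and this is not the authors' implementation.

```python
import torch
import torch.nn as nn

class WeightedFusion(nn.Module):
    """Fast normalized fusion as used in BiFPN: a learnable, non-negative
    weighted average of feature maps that share the same resolution."""
    def __init__(self, num_inputs: int, eps: float = 1e-4):
        super().__init__()
        # One scalar weight per input feature map, learned jointly
        # with the rest of the network.
        self.weights = nn.Parameter(torch.ones(num_inputs))
        self.eps = eps

    def forward(self, features):
        w = torch.relu(self.weights)   # clamp the fusion weights to be non-negative
        w = w / (w.sum() + self.eps)   # normalize so they sum to ~1
        return sum(wi * f for wi, f in zip(w, features))

# Toy usage: fuse a backbone feature map with a top-down feature map of the
# same spatial size (shapes here are illustrative only).
fuse = WeightedFusion(num_inputs=2)
p_in = torch.randn(1, 64, 32, 32)   # feature from the EfficientNet backbone
p_td = torch.randn(1, 64, 32, 32)   # feature from the top-down pathway
fused = fuse([p_in, p_td])
print(fused.shape)                  # torch.Size([1, 64, 32, 32])
```

In a full BiFPN, one such fusion node sits at every pyramid level on both the top-down and bottom-up pathways, letting the network learn how much each input scale contributes.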
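The per-image and per-patient evaluation described above can be reproduced from raw model outputs with standard tooling. Below is a minimal sketch using scikit-learn; the arrays y_true, y_score, and patient_ids are hypothetical placeholders, and aggregating a patient's images by their maximum GIST score is an assumption, since the abstract does not state how image-level predictions were combined per patient.

```python
import numpy as np
from sklearn.metrics import confusion_matrix, roc_auc_score

def binary_metrics(y_true, y_score, threshold=0.5):
    """Sensitivity, specificity, accuracy, and AUC for a binary task
    (positive class = GIST, negative class = leiomyoma)."""
    y_pred = (np.asarray(y_score) >= threshold).astype(int)
    tn, fp, fn, tp = confusion_matrix(y_true, y_pred, labels=[0, 1]).ravel()
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "accuracy": (tp + tn) / (tp + tn + fp + fn),
        "auc": roc_auc_score(y_true, y_score),
    }

# --- Hypothetical validation data (placeholders, not the study's data) ---
# y_true: 1 = histologically confirmed GIST, 0 = leiomyoma, one entry per image
y_true = np.array([1, 1, 0, 1, 0, 1, 0, 0])
# y_score: model's predicted probability of GIST for each image
y_score = np.array([0.91, 0.78, 0.35, 0.66, 0.48, 0.88, 0.12, 0.55])
# patient_ids: which patient each image belongs to
patient_ids = np.array([1, 1, 2, 3, 4, 5, 4, 2])

# Per-image analysis: score every image independently.
print("per-image:", binary_metrics(y_true, y_score))

# Per-patient analysis (assumed aggregation rule): take the maximum GIST
# score across a patient's images, then score one prediction per patient.
patients = np.unique(patient_ids)
pt_true = np.array([y_true[patient_ids == p].max() for p in patients])
pt_score = np.array([y_score[patient_ids == p].max() for p in patients])
print("per-patient:", binary_metrics(pt_true, pt_score))
```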

Original language: English
Pages (from-to): 3387-3394
Number of pages: 8
Journal: Journal of Gastroenterology and Hepatology (Australia)
Volume: 36
Issue number: 12
DOIs
State: Published - Dec 2021

Bibliographical note

Funding Information:
Financial support: This research was supported in part by the Bio & Medical Technology Development Program of the National Research Foundation (NRF) funded by the Ministry of Science & ICT (NRF-2018M3A9E8021507) (to H. H. L.); an NRF grant funded by the Korea Government (MSIT) (No. 2020R1A2C1009744) (to S. L.); an Institute for Information & Communications Technology Promotion (IITP) grant funded by the Korea government (MSIP) (No. 2019-0-01906, Artificial Intelligence Graduate School Program [POSTECH]) (to S. L.); and Po-Ca Networking Groups funded by the POSTECH-Catholic Biomedical Engineering Institute (PCBMI) (No. 5-2019-B0001-00118) (to H. H. L. and S. L.).

Publisher Copyright:
© 2021 Journal of Gastroenterology and Hepatology Foundation and John Wiley & Sons Australia, Ltd

Keywords

  • Artificial intelligence
  • Deep learning
  • Endosonography
  • Gastrointestinal stromal tumors
  • Leiomyoma
