Minireviews
Copyright ©The Author(s) 2021.
Artif Intell Gastrointest Endosc. Jun 28, 2021; 2(3): 71-78
Published online Jun 28, 2021. doi: 10.37126/aige.v2.i3.71
Table 1 Detailed information on studies concerning automatic detection by convolutional neural network in gastric cancer
Ref. | Endoscopic images | Training dataset | Test dataset | Resolution | Sensitivity (%) | Specificity (%) | Accuracy/AUC (%) | PPV (%) | NPV (%)
Hirasawa et al[10] (2018) | WLI/NBI/chromoendoscopy images | 13584 | 2296 | 300 × 300 | 92.2 | NA | NA | 30.6 | NA
Ishioka et al[21] (2019) | Video images | NA | 68 | NA | 94.1 | NA | NA | NA | NA
Li et al[23] (2020) | M-NBI images | 20000 | 341 | 512 × 512 | 91.18 | 90.64 | 90.91 | 90.64 | 91.18
Ikenoyama et al[24] (2021) | WLI/NBI/chromoendoscopy images | 13584 | 2940 | 300 × 300 | 58.4 | 87.3 | 75.7 | 26.0 | 96.5
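All of the detection metrics in Table 1 (sensitivity, specificity, accuracy, PPV and NPV) follow from a standard 2 × 2 confusion matrix of CNN predictions against the reference diagnosis. As a minimal illustration, with entirely hypothetical counts that are not taken from any of the cited studies, they can be computed as follows:

```python
# Illustrative only: how the screening metrics in Table 1 are derived from a
# 2 x 2 confusion matrix. The counts below are hypothetical, not study data.

def screening_metrics(tp: int, fp: int, tn: int, fn: int) -> dict:
    """Return sensitivity, specificity, accuracy, PPV and NPV (in %)."""
    return {
        "sensitivity": 100 * tp / (tp + fn),   # recall on cancerous images
        "specificity": 100 * tn / (tn + fp),   # recall on non-cancerous images
        "accuracy":    100 * (tp + tn) / (tp + fp + tn + fn),
        "PPV":         100 * tp / (tp + fp),   # precision of a "cancer" call
        "NPV":         100 * tn / (tn + fn),   # reliability of a "non-cancer" call
    }

if __name__ == "__main__":
    # Hypothetical counts for a low-prevalence, screening-type test set
    print(screening_metrics(tp=70, fp=160, tn=1990, fn=10))
```

With low-prevalence counts such as these, high sensitivity can coexist with a modest PPV, a pattern also visible for the detection CNNs in Table 1.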
Table 2 Detailed information on studies concerning histological classification by convolutional neural network in gastric cancer
Ref. | Training dataset | Test dataset | Resolution | Group | AUC (%)
Cho et al[25] (2019) | 4205 | 812 | 1280 × 640 | Five-category classification | 84.6
 | | | | Cancer vs non-cancer | 87.7
 | | | | Neoplasm vs non-neoplasm | 92.7
Sharma et al[27] (2017) | 231000 for cancer classification; 47130 for necrosis detection | NA | 512 × 512 | Cancer classification | 69.9
 | | | | Necrosis detection | 81.4
Iizuka et al[28] (2020) | 3628 | 500 | 512 × 512 | Adenocarcinoma | 98
 | | | | Adenoma | 93.6
Song et al[29] (2020) | 2123 | 3212 from PLAGH | 320 × 320 | Benign and malignant cases and tumour subtypes | 98.6
 | | 595 from PUMCH | | | 99.0
 | | 987 from CHCAMS | | | 99.6
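The histological classifiers in Table 2 are summarised by the area under the receiver operating characteristic curve (AUC), a threshold-independent measure of how well the model ranks positive above negative cases; multi-category tasks are usually summarised by an averaged one-vs-rest AUC. The sketch below shows one common way to compute a binary AUC, assuming scikit-learn and using made-up labels and scores rather than data from the cited studies:

```python
# Illustrative only: AUC as reported in Table 2, computed with scikit-learn
# on made-up data (not from the cited studies).
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Hypothetical per-patch outputs of a binary classifier (e.g. cancer vs non-cancer):
# labels are the ground truth, scores are the model's predicted probabilities.
labels = rng.integers(0, 2, size=1000)
scores = np.clip(labels * 0.6 + rng.normal(0.2, 0.25, size=1000), 0, 1)

auc_percent = 100 * roc_auc_score(labels, scores)
print(f"AUC = {auc_percent:.1f}%")  # threshold-independent ranking quality
```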
Table 3 Detailed information on studies concerning prediction of depth of tumor invasion by convolutional neural network in gastric cancer
Ref. | Dataset | Resolution | Sensitivity (%) | Specificity (%) | Accuracy/AUC (%) | PPV (%) | NPV (%)
Zhu et al[11] (2019) | Development dataset: 5056; validation dataset: 1264; test dataset: 203 | 299 × 299 | 76.47 | 95.56 | 89.16 | 89.66 | 88.97
Yoon et al[32] (2019) | 11539 images randomly organized into five folds; at each fold, the training:validation:testing dataset ratio was 3:1:1 | NA | 79.2 | 77.8 | 85.1 | 79.3 | 77.7
Zheng et al[34] (2020) | 5855 images in total; training:verification dataset ratio was 4:1 | 512 × 557 | NA | NA | T2 stage: 90; T3 stage: 93; T4 stage: 95 | NA | NA