Meta-Analysis Open Access
Copyright ©The Author(s) 2023. Published by Baishideng Publishing Group Inc. All rights reserved.
World J Gastrointest Oncol. Nov 15, 2023; 15(11): 1998-2016
Published online Nov 15, 2023. doi: 10.4251/wjgo.v15.i11.1998
Application of convolutional neural network-based endoscopic imaging in esophageal cancer or high-grade dysplasia: A systematic review and meta-analysis
Jun-Qi Zhang, The Fifth Clinical Medical College, Shanxi Medical University, Taiyuan 030001, Shanxi Province, China
Jun-Jie Mi, Department of Gastroenterology, Shanxi Provincial People’s Hospital, Taiyuan 030012, Shanxi Province, China
Rong Wang, Department of Gastroenterology, The Fifth Hospital of Shanxi Medical University (Shanxi Provincial People’s Hospital), Taiyuan 030012, Shanxi Province, China
ORCID number: Jun-Qi Zhang (0000-0002-3876-0502); Jun-Jie Mi (0000-0002-7999-4188); Rong Wang (0000-0002-2019-8929).
Author contributions: Zhang JQ conceived, designed the experiments and wrote a draft manuscript; Wang R analyzed, interpreted the results of the experiments and revised the manuscript; Mi JJ collected the clinical data and performed the experiments; and all authors read and approved the final manuscript.
Supported by the Special Program for Science and Technology Cooperation and Exchange of Shanxi, No. 202104041101034.
Conflict-of-interest statement: All the authors report no relevant conflicts of interest for this article.
PRISMA 2009 Checklist statement: The authors have read the PRISMA 2009 Checklist, and the manuscript was prepared and revised according to the PRISMA 2009 Checklist.
Open-Access: This article is an open-access article that was selected by an in-house editor and fully peer-reviewed by external reviewers. It is distributed in accordance with the Creative Commons Attribution NonCommercial (CC BY-NC 4.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited and the use is non-commercial. See: https://creativecommons.org/Licenses/by-nc/4.0/
Corresponding author: Rong Wang, MM, Chief Physician, Professor, Department of Gastroenterology, The Fifth Hospital of Shanxi Medical University (Shanxi Provincial People’s Hospital), No. 29 Shuangta West Street, Taiyuan 030012, Shanxi Province, China. wangxiongzai@126.com
Received: August 1, 2023
Peer-review started: August 1, 2023
First decision: August 22, 2023
Revised: September 5, 2023
Accepted: October 11, 2023
Article in press: October 11, 2023
Published online: November 15, 2023
Processing time: 105 Days and 21.8 Hours

Abstract
BACKGROUND

Esophageal cancer is the seventh-most common cancer type worldwide, accounting for 5% of deaths from malignancy. The development of novel diagnostic techniques has facilitated screening, early detection, and improved prognosis. Convolutional neural network (CNN)-based image analysis shows great potential for diagnosing esophageal cancer and determining its prognosis, and may even enable early detection of dysplasia.

AIM

To conduct a meta-analysis of the diagnostic accuracy of CNN models for the diagnosis of esophageal cancer and high-grade dysplasia (HGD).

METHODS

PubMed, EMBASE, Web of Science and Cochrane Library databases were searched for articles published up to November 30, 2022. We evaluated the diagnostic accuracy of using the CNN model with still image-based analysis and with video-based analysis for esophageal cancer or HGD, as well as for the invasion depth of esophageal cancer. The pooled sensitivity, pooled specificity, positive likelihood ratio (PLR), negative likelihood ratio (NLR), diagnostic odds ratio (DOR) and area under the curve (AUC) were estimated, together with the 95% confidence intervals (CI). A bivariate method and hierarchical summary receiver operating characteristic method were used to calculate the diagnostic test accuracy of the CNN model. Meta-regression and subgroup analyses were used to identify sources of heterogeneity.

RESULTS

A total of 28 studies were included in this systematic review and meta-analysis. Using still image-based analysis for the diagnosis of esophageal cancer or HGD provided a pooled sensitivity of 0.95 (95%CI: 0.92-0.97), pooled specificity of 0.92 (0.89-0.94), PLR of 11.5 (8.3-16.0), NLR of 0.06 (0.04-0.09), DOR of 205 (115-365), and AUC of 0.98 (0.96-0.99). When video-based analysis was used, a pooled sensitivity of 0.85 (0.77-0.91), pooled specificity of 0.73 (0.59-0.83), PLR of 3.1 (1.9-5.0), NLR of 0.20 (0.12-0.34), DOR of 15 (6-38) and AUC of 0.87 (0.84-0.90) were found. Prediction of invasion depth resulted in a pooled sensitivity of 0.90 (0.87-0.92), pooled specificity of 0.83 (95%CI: 0.76-0.88), PLR of 7.8 (1.9-32.0), NLR of 0.10 (0.41-0.25), DOR of 118 (11-1305), and AUC of 0.95 (0.92-0.96).

CONCLUSION

CNN-based image analysis in diagnosing esophageal cancer and HGD is an excellent diagnostic method with high sensitivity and specificity that merits further investigation in large, multicenter clinical trials.

Key Words: Esophageal cancer, High-grade dysplasia, Convolutional neural network, Deep learning, Systematic review, Meta-analysis

Core Tip: This systematic review provides a meta-analysis of 28 studies evaluating the accuracy of convolutional neural network (CNN) models for diagnosing esophageal cancer and high-grade dysplasia, and for predicting the invasion depth of esophageal cancer. It also establishes a theoretical foundation for the clinical application of CNN models. Based on this meta-analysis, CNN-based image analysis may have great potential for diagnosing and estimating the prognosis of esophageal cancer, though further study is needed.



INTRODUCTION

In global data reported by the International Agency for Research on Cancer, esophageal cancer was the seventh-most common malignancy in incidence and sixth in mortality worldwide in 2020[1]. Esophageal cancer has two main histological subtypes: Esophageal squamous cell carcinoma (ESCC) and esophageal adenocarcinoma (EAC)[2]. ESCC is more common in Asian countries, accounting for approximately 87% of all cases of esophageal cancer, whereas EAC is more common in Western countries and has been increasing in incidence recently[3-5]. Diverse grades of dysplasia, especially high-grade dysplasia (HGD), are precancerous lesions known to progress to esophageal cancer[3,6]. The majority of patients with esophageal cancer are diagnosed with advanced disease due to the lack of symptoms at earlier stages, resulting in a five-year survival rate of less than 20%[7,8]. When diagnosed and treated early, however, the five-year survival rate can increase to more than 85%[9,10]. Moreover, the choice of treatment modalities and the prognosis of esophageal cancer patients depend heavily on the predicted invasion depth[11,12].

Traditional endoscopy is frequently used to detect esophageal cancer and to estimate its invasion depth. However, detecting the subtle mucosal changes of early esophageal cancer with white light imaging (WLI) endoscopy alone remains challenging[13,14]. Although iodine staining improves detection accuracy, it is employed infrequently during screening because it causes discomfort and allergies to iodine are not uncommon[15,16]. Emerging endoscopic techniques such as narrow band imaging (NBI) and blue-laser imaging (BLI), post-processing imaging techniques such as i-scan and flexible spectral imaging color enhancement, and endocytoscopy, a novel endoscopic system that provides high-quality in vivo assessment of lesions, have greatly increased the rate of esophageal cancer detection, but all depend on specialized training and experience on the part of the endoscopist[14,17,18]. Additionally, esophageal lesions frequently have irregular shapes and indistinct borders, and the pressure to complete procedures quickly limits the time available for diagnosis and the confidence of the interpretation, resulting in variable performance even among experts[19].

Artificial intelligence (AI) is being utilized in medicine at an ever-increasing rate thanks to advances in deep learning (DL), one of its core branches[20-22]. The convolutional neural network (CNN) is a DL model inspired by the biological mechanism of object perception in the animal brain[23], with self-learning abilities that allow it to encode complex signals. After the original image is entered into the CNN model, the convolution layers automatically extract the color, texture, local features, and global features of the image according to the settings defined by the investigators, and the model then performs diagnostic visual tasks such as recognition of diabetic retinopathy or skin cancer[24,25]. Because it preserves the spatial relationships within endoscopic images, CNN can also assist in the diagnosis of gastrointestinal diseases, including the detection of colorectal polyps, Helicobacter pylori infection, and gastrointestinal cancer[20,26-28]. Given the recent increasing use of endoscopy, CNN has been applied extensively to the diagnosis of esophageal cancer and premalignant lesions, as well as to predicting the invasion depth of esophageal cancer[29-31].
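To make the preceding description concrete, below is a minimal illustrative sketch in Python (PyTorch) of a small CNN that classifies an endoscopic image patch. The class name TinyEsophagusCNN, the layer sizes, the 224 x 224 input resolution, and the two-class output are assumptions chosen for illustration only; they do not reproduce the architecture of any study included in this review.

    # Minimal illustrative CNN for binary classification of an endoscopic image patch.
    # All sizes and names are illustrative assumptions, not any included study's model.
    import torch
    import torch.nn as nn

    class TinyEsophagusCNN(nn.Module):
        def __init__(self, num_classes: int = 2):
            super().__init__()
            # Stacked convolution + pooling blocks extract local color/texture features
            self.features = nn.Sequential(
                nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1),  # global average pooling: one value per channel
            )
            self.classifier = nn.Linear(64, num_classes)  # maps pooled features to class scores

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            x = self.features(x)       # (N, 64, 1, 1)
            x = torch.flatten(x, 1)    # (N, 64)
            return self.classifier(x)  # (N, num_classes) raw logits

    if __name__ == "__main__":
        model = TinyEsophagusCNN()
        batch = torch.randn(4, 3, 224, 224)         # four RGB frames, 224 x 224 pixels
        probs = torch.softmax(model(batch), dim=1)  # per-class probabilities
        print(probs.shape)                          # torch.Size([4, 2])

The architectures actually evaluated in the included studies (e.g., the ResNet, VGG, and SSD variants listed in Tables 1 and 3) follow the same convolutional feature-extraction principle but are far deeper, are in some cases detection rather than classification networks, and are trained on large, annotated endoscopic image datasets.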

For this systematic review we conducted a meta-analysis of the diagnostic test accuracy (DTA) achieved by the CNN model in diagnosing esophageal cancer and HGD, as well as its ability to predict the invasion depth of esophageal cancer.

MATERIALS AND METHODS
Literature search

This systematic review and meta-analysis was conducted in accordance with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. Two investigators independently searched the PubMed, EMBASE, Web of Science, and Cochrane Library databases for all studies published up to November 30, 2022 that used the CNN model to detect esophageal cancer and HGD. The following terms were used for the search: (“convolutional neural network” OR “convolutional neural networks” OR “computer-aided” OR “computer aided” OR “artificial intelligence” OR “machine learning” OR “deep learning” OR “hierarchical learning” OR “computational intelligence” OR “machine intelligence” OR “computer reasoning”) AND (“esophageal neoplasms” OR “esophageal neoplasm” OR “esophagus neoplasm” OR “esophagus neoplasms” OR “esophagus cancer” OR “esophagus cancers” OR “esophageal cancer” OR “esophageal cancers” OR “oesophagus cancer” OR “oesophageal cancers” OR “oesophagus neoplasm” OR “oesophageal neoplasms” OR “esophageal squamous cell carcinoma” OR “adenocarcinoma of esophagus” OR “oesophageal squamous cell carcinoma” OR “esophageal adenocarcinoma” OR “oesophageal adenocarcinoma” OR “Barrett’s esophagus” OR “Barrett’s oesophagus”). Only English-language articles were included. The authors screened all articles and emailed the study authors to obtain missing data or materials before excluding any relevant article from the analysis. Duplicate studies, reviews and meta-analyses, and studies deemed irrelevant after reading the title, abstract, and full text were excluded, as were studies with insufficient information or that did not meet the inclusion criteria. Two authors discussed any discrepancies and sought advice from a third author to reconcile them.

Study selection

The inclusion criteria were: (1) Analysis of a CNN model utilizing still images or video to diagnose esophageal cancer or HGD; (2) Analysis of a CNN model for predicting the invasion depth of esophageal cancer or HGD, or for identifying intrapapillary capillary loops (IPCLs) of esophageal cancer and HGD; (3) Prospective or retrospective studies; (4) Cases of histologically-proven esophageal cancer and HGD; and (5) Studies published in English. The exclusion criteria were: (1) Reviews or meta-analyses; (2) Proceedings, letters or comments; (3) Experimental studies; (4) Animal studies; or (5) Studies with incomplete data.

Data extraction

Two authors independently extracted information from the identified reports, and resolved disagreements through extensive discussion to reach a consensus. The authors extracted the following information from each eligible study: First author, publication year, continent, scale (single center or multicenter), external validation (yes/no), study format, case type (image or patient), real-time (yes/no), histological type, image type, quality (see below), number of patients or endoscopic images, and algorithms for CNN models. The rates of true positivity, false positivity, false negativity, and true negativity for the CNN models and endoscopists in diagnosing esophageal cancer and HGD were also extracted, together with the prediction of invasion depth of esophageal cancer or HGD, or identification of IPCLs of esophageal cancer or HGD.

Quality assessment

The Quality Assessment of Diagnostic Accuracy Studies 2 (QUADAS-2) tool was used to determine the quality of the included studies. It assesses four domains, comprising patient selection, index test, reference standard, and flow and timing, with the first three domains also used to assess applicability[32]. Each domain was graded by two authors as having a high, low, or unclear risk of bias.

Outcome measures

The primary outcomes determined were the pooled diagnostic accuracy, pooled sensitivity, pooled specificity, positive likelihood ratio (PLR) and negative likelihood ratio (NLR) of the CNN models for diagnosing esophageal cancer and HGD, for predicting the invasion depth of esophageal cancer and HGD, and for identifying IPCLs in esophageal cancer or HGD. The area under the curve (AUC) was used to measure the accuracy of the CNN. The secondary outcome was the performance of endoscopists compared with that of CNN models for detection of esophageal cancer or HGD using the same still images and videos.
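For reference, these indices follow the standard definitions from the 2 x 2 diagnostic table of true positives (TP), false positives (FP), false negatives (FN), and true negatives (TN); the formulas below are written in our own notation and are not quoted from the included studies:

    \mathrm{Sensitivity} = \frac{TP}{TP + FN}, \qquad \mathrm{Specificity} = \frac{TN}{TN + FP}
    \mathrm{PLR} = \frac{\mathrm{Sensitivity}}{1 - \mathrm{Specificity}}, \qquad \mathrm{NLR} = \frac{1 - \mathrm{Sensitivity}}{\mathrm{Specificity}}, \qquad \mathrm{DOR} = \frac{\mathrm{PLR}}{\mathrm{NLR}} = \frac{TP \times TN}{FP \times FN}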

Statistical analyses

The main statistical analysis of this DTA meta-analysis used the bivariate method and the hierarchical summary receiver operating characteristic (HSROC) method, which account for the correlation between sensitivity and specificity, to calculate the pooled sensitivity, pooled specificity, PLR, NLR, diagnostic odds ratio (DOR) and area under the receiver operating characteristic curve (AUROC) of CNN models and of endoscopists for detecting esophageal cancer or HGD. Heterogeneity was assessed with the HSROC method by estimating the correlation coefficient between logit-transformed sensitivity and specificity and the asymmetry parameter β, where β = 0 indicates a symmetric ROC curve; the HSROC curve was also inspected visually for signs of heterogeneity. Meta-regression and subgroup analyses were used to identify sources of heterogeneity. The 95% confidence interval (CI) of the AUROC was calculated and compared within each subgroup; non-overlapping 95%CIs of the AUROC indicated a statistically significant difference between two subgroups. Stata version 15.1 (StataCorp, College Station, Texas, United States) with the MIDAS and METANDI packages was used for the main statistical analysis. Meta-DiSc 1.4 (XI Cochrane Colloquium, Barcelona, Spain) was used for subgroup analyses of data with small sample sizes. The figures for methodological quality assessment and the HSROC curves for small-sample data were drawn with RevMan 5.3 (The Nordic Cochrane Centre, Copenhagen, Denmark). Publication bias was analyzed using Deeks’ test[33]. A P value < 0.05 was considered statistically significant. The statistical methods of the study were reviewed by Professor Ming-Cheng Li from Beihua University.
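For readers unfamiliar with the bivariate approach implemented by the MIDAS and METANDI commands, the standard formulation of the bivariate random-effects model is sketched below in our own notation (it is not quoted from the included studies). Each study i contributes binomial likelihoods for the diseased and non-diseased groups, and the logit-transformed sensitivity and specificity are assumed to vary across studies according to a bivariate normal distribution:

    TP_i \sim \mathrm{Binomial}(n_{1i}, \mathrm{Se}_i), \qquad TN_i \sim \mathrm{Binomial}(n_{0i}, \mathrm{Sp}_i)
    \begin{pmatrix} \mathrm{logit}(\mathrm{Se}_i) \\ \mathrm{logit}(\mathrm{Sp}_i) \end{pmatrix} \sim N\left( \begin{pmatrix} \mu_{Se} \\ \mu_{Sp} \end{pmatrix}, \begin{pmatrix} \sigma_{Se}^{2} & \sigma_{SeSp} \\ \sigma_{SeSp} & \sigma_{Sp}^{2} \end{pmatrix} \right)

where n_{1i} and n_{0i} are the numbers of diseased and non-diseased cases in study i. The pooled sensitivity and specificity are obtained by back-transforming \mu_{Se} and \mu_{Sp}, and the HSROC curve with its asymmetry parameter β corresponds to the equivalent hierarchical (Rutter-Gatsonis) parameterization of the same model.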

RESULTS
Literature search and screening results

A total of 2045 studies were identified initially using the search strategy described above. Of these, 655 were excluded because they were duplicate studies, 133 because they were meta-analyses or reviews, and 1205 because they were deemed irrelevant based on their titles and abstracts. The remaining 52 studies were examined in full, and 24 were rejected because they contained insufficient data or were comments or proceedings. Finally, the authors identified 28 studies that met the inclusion and exclusion criteria for this systematic review and meta-analysis[17,20,29,31,34-57]. The flowchart of the search procedure is shown in Figure 1.

Figure 1
Figure 1 Flowchart of the search process.
Quality assessment of the included literature

The QUADAS-2 tool was used to assess the quality and risk of bias of the included studies. Most studies were rated as having a low risk of bias in all domains. Of the 28 studies, only three failed to indicate whether patients were enrolled consecutively or selected at random[41,44,45]; their risk of bias for patient selection was therefore rated as unclear. Overall, the included studies were of high quality, as shown in Figure 2A and B.

Figure 2
Figure 2 Methodological quality assessment. A: Summary graph of quality in the methodology; B: Summary table of quality in the methodology.
Meta-analysis of CNN for the diagnosis of esophageal cancer or HGD based on still images

A total of 19 still image-based studies assessed the diagnostic value of CNN for esophageal cancer or HGD[17,20,34-50], providing extractable data from a total of 20867 images. Twelve studies assessed images from Asian populations[17,20,34,37-39,45-50], five from European and American populations[35,41-44], and two from multiregional populations[36,40]. Thirteen studies provided CNN data using WLI[17,20,34,36,39,41-44,46-49] and 10 applied advanced imaging technologies such as NBI and BLI[17,20,35,37,38,40,42,45,48,50]. The histological types examined were sorted into three categories: Esophageal cancer (including ESCC and EAC)[17,20,34-38,40,41,46,47], Barrett’s neoplasia (including HGD and EAC)[39,42-45,49,50], and esophageal squamous cell neoplasia (including HGD and ESCC)[48]. One study used a still image-based CNN to identify the IPCLs of esophageal cancer[38] and was also included in the meta-analysis. For the 19 still image-based studies describing the diagnosis of esophageal cancer or HGD, the pooled sensitivity was 0.95 (95%CI: 0.92-0.97), pooled specificity was 0.92 (0.89-0.94), PLR was 11.5 (8.3-16.0), NLR was 0.06 (0.04-0.09), DOR was 205 (115-365), and AUC was 0.98 (0.96-0.99) (Table 1). The studies included in this analysis showed substantial heterogeneity (P < 0.001, I² = 98%). However, the HSROC curve was symmetric and a threshold effect was excluded: the correlation coefficient between logit-transformed sensitivity and specificity was r = -0.225, and the asymmetry parameter β was nonsignificant (P = 0.431) (Figure 3A).

Figure 3
Figure 3 Summary of the receiver operating characteristic, forest plots, and univariable meta-regression plot of convolutional neural network for the diagnosis of esophageal cancer or high-grade dysplasia based on still images. A: Summary of the receiver operating characteristic of convolutional neural network (CNN) for the diagnosis of esophageal cancer or high-grade dysplasia (HGD) based on still images; B: Coupled forest plots for the sensitivity and specificity of CNN in the diagnosis of esophageal cancer or HGD based on still images; C: Univariable meta-regression plot of CNN for the diagnosis of esophageal cancer or HGD based on still images. CI: Confidence interval; SROC: Summary receiver operating characteristic.
Table 1 Characteristics of the still image-based studies.
Ref.
Format
Scale
Continent
Case type
Architecture of CNN
Image type
Histological type
Real-time
External validation
Quality
Endoscopist control
Patients training set
Images training set
Patients test set
Images test set
TP
FP
FN
TN
Li et al[17], 2021RetrospectiveMulticenterAsiaImageVisual geometry groupNBI/WLIESCCNoNoHigh2064747351126322523714329
Ohmori et al[20], 2020RetrospectiveUnicenterAsiaPatientSSDNBI/BLIESCCNoNoHigh15NM225622377275116134
Cai et al[34], 2019RetrospectiveMulticenterAsiaImage8-layer convolutional neural networkWLIESCCNoNoHigh167462428521878914282
Ebigbo et al[35], 2019ProspectiveUnicenterEuropeImageResNetWLI/NBIEACNoNoHigh131132486274325136
Ghatwary et al[36], 2019RetrospectiveUnicenterPublicImageR-CNN, Fast R-CNN, Faster R-CNN, SSDWLIEACNoNoHighNo21NM39100484246
Kumagai et al[37], 2019RetrospectiveUnicenterAsiaPatientGoogLeNetECSESCCNoNoHighNo2404715551520253225
Zhao et al[38], 2019RetrospectiveUnicenterAsiaIPCLs imageImageNet VGG-16ME-NBIESCCNoNoHigh9NM261NM1383102333153174
Liu et al[39], 2020RetrospectiveUnicenterAsiaImageInception-ResNetWLIESCC/EACNoNoHighNoNM1017NM127274888
Guo et al[40], 2020RetrospectiveMulticenterPublicImageSegNetNBIESCCYesYesHighNo5496473212366711451258294933
Ebigbo et al[41], 2020RetrospectiveUnicenterEuropeImageResNetWLIEACYesNoLowNoNM1291462300626
Hashimoto et al[42], 2020RetrospectiveUnicenterAmericaImageInception-ResNet v2NBI/WLIBarrett’s neoplasia (HGD/EAC)YesNoHighNo100183239458217138220
de Groof et al[43], 2020ProspectiveMulticenterEuropePatientResNet/U-NetWLIBarrett’s neoplasia (HGD/EAC)YesYesHigh53NM1544201442515896
de Groof et al[44], 2020RetrospectiveMulticenterEuropeImageResNet/U-NetWLIBarrett’s neoplasia (HGD/EAC)YesYesLow53157004956112554571863123217
Du et al[45], 2021RetrospectiveUnicenterAsiaImageDenseNetWLIESCC/EACNoNoLowNo325316771824419411061091032876
Tang et al[46], 2021RetrospectiveMulticenterAsiaImageResNet50WLIESCCYesYesHigh10107840022431033297876643
Yang et al[47], 2021RetrospectiveUnicenterAsiaImageYolo V3WLI/ME-OEESCCNoNoHigh6621532373NM1123263135774
Wang et al[48], 2021RetrospectiveUnicenterAsiaPatientSSDWLI/NBIESCN (HGD/ESCC)NoNoHighNo469362022641695226
Gong et al[49], 2022ProspectiveMulticenterAsiaImageGrad-CAMWLIESCC/EACNoYesHighNoNM4387NM16116315821901
Zhao et al[50], 2022RetrospectiveUnicenterAsiaPatientGoogLeNet-Inception V3NBIESCC/EACNoNoHigh2200NM100NM454546

A coupled forest plot of sensitivity and specificity is shown in Figure 3B. Meta-regression analysis of these data revealed that histological type was the only significant source of heterogeneity (P = 0.01) when the publication year (P = 0.26), continent (P = 0.65), scale (P = 0.61), external validation (P = 0.94), study type (P = 0.84), case type (P = 0.10), real-time (P = 0.90), image type (P = 0.07), quality (P = 0.10) and number of cases (P = 0.22) were included in the analysis. The data were also subjected to subgroup analysis (Table 2 and Figure 3C). Three still image-based studies compared the diagnostic performance of endoscopists with that of CNN models[20,34,50]. The CNN models showed higher sensitivity than did endoscopists [0.96 (95%CI: 0.92-0.98) vs 0.87 (95%CI: 0.81-0.91)] (Figure 4A-D). Using the same still image dataset, the diagnostic performance of the CNN models was marginally better than that of endoscopists, as shown by the plot of the HSROC curve (Figure 5).

Figure 4
Figure 4 Forest plots of convolutional neural network and endoscopist results for the diagnosis of esophageal cancer or high-grade dysplasia based on still images. A: Forest plot of the sensitivity by endoscopists for the diagnosis of esophageal cancer or high-grade dysplasia (HGD) based on still images; B: Forest plot of the specificity by endoscopists for the diagnosis of esophageal cancer or HGD based on still images; C: Forest plot of the sensitivity by convolutional neural network (CNN) for the diagnosis of esophageal cancer or HGD based on still images; D: Forest plot of the specificity by CNN for the diagnosis of esophageal cancer or HGD based on still images. CI: Confidence interval.
Figure 5
Figure 5 Summary of the receiver operating characteristic by convolutional neural network and endoscopists for the diagnosis of esophageal cancer or high-grade dysplasia based on still images. CNN: Convolutional neural network.
Table 2 Full details of the meta-analysis and subgroup analysis of the convolutional neural network model for the diagnosis of esophageal cancers or neoplasms in the still image-based analysis.

Number of studies
Sensitivity (95%CI)
Specificity (95%CI)
PLR (95%CI)
NLR (95%CI)
DOR (95%CI)
AUC (95%CI)
P value
CNN190.95 (0.92-0.97)0.92 (0.89-0.94)11.5 (8.3-16.0)0.06 (0.04-0.09)205 (115-365)0.98 (0.96-0.99)
Continent0.65
Asian120.95 (0.92-0.97)0.91 (0.87-0.95)11.1 (7.0-17.5)0.05 (0.03-0.09)222 (110-444)0.98 (0.96-0.99)
Europe/America50.91 (0.86-0.94)0.90 (0.87-0.92)9.3 (7.0-12.3)0.10 (0.06-0.16)91 (45-186)0.95 (0.93-0.97)
Public2
Scale0.61
Unicenter120.94 (0.90-0.97)0.93 (0.88-0.96)13.2 (7.8-22.5)0.06 (0.03-0.11)219 (103-465)0.98 (0.96-0.99)
Multicenter70.95 (0.91-0.98)0.90 (0.87-0.93)10.0 (7.3-13.8)0.05 (0.03-0.10)191 (78-471)0.97 (0.95-0.98)
External validation or not0.94
External validation50.95 (0.88-0.98)0.91 (0.87-0.94)10.5 (6.9-16.0)0.06 (0.02-0.14)186 (55-635)0.97 (0.95-0.98)
No external validation140.95 (0.91-0.97)0.92 (0.88-0.95)12.1 (7.7-19.1)0.06 (0.03-0.09)213 (111-407)0.98 (0.96-0.99)
Format0.84
Retrospective160.95 (0.92-0.97)0.92 (0.88-0.95)12.0 (8.1-17.7)0.05 (0.03-0.09)223 (121-411)0.98 (0.96-0.99)
Prospective3
Case type0.1
Image140.95 (0.92-0.97)0.93 (0.90-0.95)13.7 (9.6-19.6)0.05 (0.03-0.09)252 (132-478)0.98 (0.96-0.99)
Patient50.95 (0.84-0.98)0.84 (0.75-0.90)5.8 (3.8-8.9)0.06 (0.02-0.19)94 (34-265)0.92 (0.90-0.94)
Real-time or not0.9
Real-time70.94 (0.88-0.97)0.91 (0.88-0.94)11.0 (7.6-16.0)0.06 (0.03-0.13)175 (65-471)0.96 (0.94-0.98)
No real-time120.95 (0.92-0.97)0.91 (0.87-0.95)11.1 (7.0-17.7)0.05 (0.03-0.09)210 (103-430)0.98 (0.96-0.99)
Histological type0.01
ESCN90.97 (0.94-0.98)0.90 (0.83-0.94)9.6 (5.6-16.3)0.04 (0.02-0.06)272 (106-699)0.98 (0.97-0.99)
Barrett’s neoplasia60.92 (0.85-0.96)0.91 (0.87-0.93)9.7 (6.7-14.1)0.09 (0.05-0.17)108 (43-272)0.96 (0.93-0.97)
ESCC/EAC40.92 (0.85-0.96)0.96 (0.94-0.97)23.0 (17.2-30.6)0.08 (0.04-0.16)283 (178-450)0.98 (0.96-0.99)
Image type0.07
WLI130.95 (0.91-0.97)0.89 (0.85-0.92)8.3 (6.2-11.0)0.06 (0.03-0.11)143 (75-273)0.96 (0.94-0.97)
Advanced imaging100.95 (0.91-0.97)0.93 (0.88-0.96)13.6 (7.5-24.6)0.06 (0.03-0.10)237 (107-525)0.98 (0.96-0.99)
Quality0.1
High160.96 (0.93-0.97)0.91 (0.88-0.94)10.7 (7.6-15.2)0.05 (0.03-0.08)223 (115-434)0.98 (0.96-0.99)
Low3
Meta-analysis of CNN models for the diagnosis of esophageal cancer or HGD based on videos

Eight video-based studies reported the diagnostic value of CNN for esophageal cancer or HGD[43,47,51-56], evaluating a total of 1262 videos. Six studies provided images from Asian populations[47,51,53-56] and two from European populations[43,52]. Five studies provided data using WLI[43,47,53-55] and five using advanced imaging technologies such as NBI and BLI[51-54,56]. The histological types analyzed in these studies included ESCC[47,51,53-56] and Barrett’s neoplasia (including HGD and EAC)[43,52]. Results from these eight video-based studies showed a pooled sensitivity of 0.85 (95%CI: 0.77-0.91), pooled specificity of 0.73 (0.59-0.83), PLR of 3.1 (1.9-5.0), NLR of 0.20 (0.12-0.34), DOR of 15 (6-38), and AUC of 0.87 (0.84-0.90) (Table 3).

Table 3 Characteristics of the video-based studies.
Ref.
Format
Scale
Continent
Case type
Architecture of CNN
Image type
Histological type
Real-time
External validation
Quality
Endoscopist control
Patients training set
Videos training set
Patients test set
Videos test set
TP
FP
FN
TN
de Groof et al[43], 2020ProspectiveMulticenterEuropeVideoResNet/U-NetWLIBarrett’s neoplasia (HGD/EAC)YesYesHigh53NM154420209317
Yang et al[47], 2021RetrospectiveUnicenterAsiaVideoYolo V3WLIESCCNoNoHigh6621532373 image/104 videoNM68392126
Fukuda et al[51], 2020RetrospectiveUnicenterAsiaVideoSSD/VGG-16NBI/BLIESCCYesYesHigh13200228333NM23880531095
Struyvenberg et al[52], 2021RetrospectiveMulticenterEuropeVideoResNet/U-NetNBIBarrett’s neoplasia (HGD/EAC)YesYesHighNo15700495611504711415836236
Waki et al[53], 2021RetrospectiveMulticenterAsiaVideoResNet/ImageNetWLI/NBI/BLIESCCYesNoHigh21157218797113200103662334
Shiroma et al[54], 2021RetrospectiveUnicenterAsiaVideoSSDNBIESCCYesNoHigh18NM84284080114916
Yuan et al[55], 2022RetrospectiveMulticenterAsiaVideoYOLO v3WLIESCCYesYesHigh112621 image/19 video53933 image/142 videoNM38175214
Tajiri et al[56], 2022RetrospectiveUnicenterAsiaVideoResNet/ImageNetWLI/NBI/BLIESCCNoNoHigh1918432979413014771161248

The studies included in the video-based analysis exhibited substantial heterogeneity (P < 0.001, I² = 93%). However, the HSROC curve was symmetric (Figure 6A), the correlation coefficient between logit-transformed sensitivity and specificity was r = 0.277, and the asymmetry parameter β was nonsignificant (P = 0.630); the observed heterogeneity was therefore not due to a threshold effect. Coupled forest plots for sensitivity and specificity are shown in Figure 6B. Meta-regression and subgroup analyses revealed no obvious sources of heterogeneity (Table 4 and Figure 6C) from publication year (P = 0.86), continent (P = 0.73), scale (P = 0.55), histological type (P = 0.73), external validation (P = 0.94), study type (P = 0.89), real-time use (P = 0.13), image type (P = 0.76), or number of cases (P = 0.76). Case type and quality were identical across the eight studies, so no meta-regression was performed for these factors. Because only one video-based study compared the diagnostic performance of the CNN model with that of endoscopists[53], no pooled analysis of endoscopist performance was performed.

Figure 6
Figure 6 Summary of receiver operating characteristic, forest plots, and univariable meta-regression plot of convolutional neural network in the diagnosis of esophageal cancer or high-grade dysplasia based on videos. A: Summary of the receiver operating characteristic of convolutional neural network (CNN) for the diagnosis of esophageal cancer or high-grade dysplasia (HGD) based on videos; B: Coupled forest plots of sensitivity and specificity of CNN for the diagnosis of esophageal cancer or HGD based on videos; C: Univariable meta-regression plot of CNN for the diagnosis of esophageal cancer or HGD based on videos. CI: Confidence interval; SROC: Summary receiver operating characteristic.
Table 4 Full details of the meta-analysis and subgroup analysis of the convolutional neural network model for the diagnosis of esophageal cancers or neoplasms in the video-based analysis.

Number of studies
Sensitivity (95%CI)
Specificity (95%CI)
PLR (95%CI)
NLR (95%CI)
DOR (95%CI)
AUC (95%CI)
P value
CNN0.85 (0.77-0.91)0.73 (0.59-0.83)3.1 (1.9-5.0)0.20 (0.12-0.34)15 (6-38)0.87 (0.84-0.90)
Continent0.73
Asian60.86 (0.76-0.93)0.71 (0.53-0.85)3.0 (1.6-5.5)0.19 (0.09-0.40)16 (5-54)0.87 (0.84-0.90)
Europe/America2
Scale0.55
Unicenter40.87 (0.68-0.96)0.77 (0.62-0.87)3.8 (2.0-7.0)0.17 (0.06-0.49)23 (5-106)0.87 (0.84-0.90)
Multicenter40.81 (0.77-0.85)0.65 (0.43-0.82) 2.3 (1.3-4.2)0.29 (0.20-0.41)8 (3-20)0.82 (0.78-0.85)
External validation or not0.94
External validation0.85 (0.78-0.91)0.73 (0.63-0.80)3.1 (2.4-4.1)0.20 (0.14-0.29)16 (10-24)0.87 (0.84-0.90)
No external validation0.85 (0.66-0.94)0.74 (0.45-0.90)3.2 (1.2-8.5)0.20 (0.07-0.60)16 (2-106)0.87 (0.84-0.90)
Format0.89
Retrospective50.85 (0.76-0.91)0.73 (0.58-0.84)3.1 (1.9-5.3)0.21 (0.12-0.36)15 (6-41)0.87 (0.84-0.90)
Prospective1
Real-time or not0.13
Real-time60.82 (0.74-0.87)0.68 (0.52-0.80)2.5 (1.6-3.9)0.27 (0.19-0.39)9 (5-18)0.83 (0.80-0.86)
No real-time2
Histological type0.73
ESCN60.86 (0.76-0.93)0.71 (0.53-0.85)3.0 (1.6-5.5)0.19 (0.09-0.40)16 (5-54)0.87 (0.84-0.90)
Barrett’s neoplasia2
Image type0.76
WLI40.83 (0.71-0.91)0.49 (0.27-0.71)1.6 (0.9-2.8)0.34 (0.13-0.88)5 (1-20)0.80 (0.77-0.84)
Advanced imaging50.83 (0.77-0.88)0.71 (0.56-0.82)2.9 (1.9-4.3)0.24 (0.19-0.30)12 (8-19)0.86 (0.82-0.88)
Meta-analysis of CNN for predicting the invasion depth of esophageal cancer

Three studies used the CNN model to predict the invasion depth of esophageal cancer and provided extractable data[29,31,57]. One differentiated cancers confined to the epithelium or superficial submucosa (pEP-SM1) from those with deep submucosal invasion (pSM2/3)[29], one reported the diagnostic performance of CNN for pEP-SM1 and pEP-muscularis mucosae cancers[31], and one reported the diagnostic performance of CNN for pEP-SM1 cancer[57]. The pooled sensitivity was 0.90 (95%CI: 0.87-0.92), pooled specificity was 0.83 (0.76-0.88), PLR was 7.8 (1.9-32.0), NLR was 0.10 (0.41-0.25), DOR was 117.76 (10.63-1304.7), and AUC was 0.95 (0.92-0.96) (Table 5). The HSROC curve and coupled forest plots of sensitivity and specificity are shown in Figure 7A-C. Two studies compared the diagnostic performance of endoscopists with that of CNN models for predicting the invasion depth of esophageal cancer[31,57]; however, because only one provided specific data, no pooled analysis of endoscopist performance was performed.

Figure 7
Figure 7 Summary of receiver operating characteristic and forest plots for convolutional neural network in predicting the invasion depth of esophageal cancer. A: Summary of receiver operating characteristic for convolutional neural network (CNN) in predicting the invasion depth of esophageal cancer; B: Forest plots of sensitivity for CNN in predicting the invasion depth of esophageal cancer; C: Forest plots of specificity for CNN in predicting the invasion depth of esophageal cancer. AUC: Area under the curve; SROC: Summary receiver operating characteristic; CI: Confidence interval.
Table 5 Characteristics of the studies on the diagnosis of the invasion depth of esophageal cancer.
Ref.
Format
Scale
Continent
Depth
Architecture of CNN
Image type
Histological type
Real-time
External validation
Quality
Endoscopist control
Patients training set
Images training set
Patients test set
Images test set
TP
FP
FN
TN
Horie et al[29], 2019RetrospectiveUnicenterAsiaT1a, T1b vs T2-4SSDWLI/NBIESCC/EACYesNoHighNo3848428NM1681422123
Nakagawa et al[31], 2019RetrospectiveUnicenterAsiapEP-SM1, pEP-MMSSDWLI/NBI/BLIESCCNoNoHigh16804143381559147142460132
Tokai et al[57], 2020RetrospectiveUnicenterAsiapEP-SM1SSDNBI/WLIESCCNoNoHigh13NM10179NM279159243066
Evaluation of publication bias

Deeks’ funnel plot of the 19 still image-based studies showed a symmetrical shape with respect to the regression line (Figure 8A), and the asymmetry test revealed no significant publication bias (P = 0.07). Deeks’ funnel plot of the eight video-based studies likewise showed a symmetrical shape with respect to the regression line (Figure 8B), with no significant publication bias (P = 0.55).

Figure 8
Figure 8 Deeks’ funnel plots of publication bias. A: Deeks’ funnel plot of convolutional neural network (CNN) for the diagnosis of esophageal cancer or high-grade dysplasia (HGD) based on still images; B: Deeks’ funnel plot of CNN for the diagnosis of esophageal cancer or HGD based on videos.
DISCUSSION

Esophageal cancer is a malignant neoplasm with early, rapid metastasis and a poor prognosis, but endoscopy can provide early diagnosis and therapy[58]. Endoscopists can find it challenging to accurately diagnose esophageal cancer and HGD when relying solely on their own skills, but AI may have clinical applicability to achieve greater accuracy[59]. CNN is a branch of DL that uses a special learning method to develop image recognition capabilities through training datasets. Recently, CNN has been applied to the analysis of endoscopic images and videos, showing rapid progress and developing progressively into a crucial auxiliary tool for endoscopists[60]. Additionally, CNN has been used to recognize the geometry of IPCLs to gauge the invasion depth of esophageal cancer, as well as to help medical professionals build treatment regimens[61,62]. This systematic review and meta-analysis demonstrates that the CNN method can reliably identify esophageal cancer and HGD, providing great clinical applicability. The current meta-analysis found that CNN was effective at identifying esophageal cancer based on still image data, with values for pooled sensitivity, pooled specificity, PLR, NLR, DOR and AUC of 0.95, 0.92, 11.5, 0.06, 205, and 0.98, respectively. The still image dataset demonstrates the ability of CNN to identify uncertain lesions discovered during endoscopy, and CNN showed higher sensitivity than endoscopists. It might therefore reduce the rate of missed diagnosis of esophageal cancers and neoplasms and help endoscopists find lesions that are easily overlooked.

The meta-analysis of video data revealed that the CNN model also performs well for the diagnosis of esophageal cancer and HGD, with a pooled sensitivity, pooled specificity, PLR, NLR, DOR and AUC of 0.85, 0.73, 3.1, 0.20, 15, and 0.87, respectively. Although this diagnostic performance is good, the CNN was slightly less accurate on video than on still images. This reduced accuracy may arise because CNN performance on video is influenced by a variety of factors, including poor insufflation, bleeding, blurring, focus, angle, the procedure itself, patient cooperation, and image quality. However, video more closely mimics the actual endoscopic procedure and therefore serves as a valuable benchmark for the operational performance of CNN. Esophageal lesions can be overlooked when the endoscope passes through the esophagus too quickly or when the endoscopist lacks sufficient expertise; CNN can assist endoscopists in correctly identifying and further characterizing such lesions. Refinement and expansion of the training datasets should improve CNN performance on video[37,38].

The robustness of the diagnostic performance of the CNN model can be seen in the subgroup analysis, in which no appreciable differences in performance were observed across subgroups. Moreover, the diagnostic efficacy of the CNN model did not differ significantly according to continent, histology, or case type. Thus, we conclude that CNN based on still images can be applied to a wide range of gastrointestinal diseases and endoscopic applications[63,64]. Importantly, CNN models based on WLI and on other, more advanced imaging modalities showed similarly excellent diagnostic performance. Advanced imaging modes such as NBI and BLI improve visualization of the surface structure and microvascular morphology of lesions, which is one of the standard approaches to diagnosing esophageal cancer. It is worth noting that endocytoscopy can recognize the histological structure of the precancerous epithelium with the help of intraprocedural staining, so-called “virtual histology”[65]. Application of the CNN model to these methods may compensate for interobserver variability[20,66].

Advances in real-time diagnostic capabilities have also increased the importance of CNN in clinical practice. Real-time use requires a recognition speed of at least 25 frames per second, and current models frequently achieve 30 to 60 frames/s without perceptible latency[52,54,55]. The identification speed of CNN may therefore reduce the time needed for diagnosis and shorten endoscopic procedures. Establishing the diagnosis and selecting the appropriate treatment strategy depend upon accurate endoscopic prediction of the invasion depth of esophageal cancer. Endoscopic resection should be the treatment of choice for lesions confined to the EP-SM1 layers because the risk of lymph node metastasis is low at this extent of disease, whereas lesions that invade SM2-SM3 should be treated with surgery or chemoradiotherapy because of the increased risk of lymph node metastasis[11,67,68]. Based on this meta-analysis, the CNN model is well suited to predicting the invasion depth of esophageal cancer, with a pooled sensitivity of 0.90 (95%CI: 0.87-0.92), pooled specificity of 0.83 (0.76-0.88), and AUC of 0.95 (0.92-0.96). Two of the included studies compared the diagnostic performance of endoscopists with that of CNN models for predicting invasion depth: Tokai et al[57] concluded that the CNN model was more accurate than endoscopists, while Nakagawa et al[31] reported that the CNN performed similarly to experienced endoscopists. Morphological changes in IPCLs, the microvascular structures on the surface of esophageal cancer, are closely associated with the invasion depth of the tumor. Only one of the included studies reported the diagnostic performance of CNN for identifying IPCLs in esophageal cancer[38]; using a still image-based CNN model, it found a mean diagnostic precision of 89.2% at the lesion level and 93.0% at the pixel level.

The present DTA meta-analysis has demonstrated the powerful detection efficiency of the CNN model for esophageal cancers and neoplasms. The analysis nonetheless has several limitations. First, the included studies did not contain sufficient information to evaluate the overall diagnostic accuracy of endoscopists or of endocytoscopy. Second, despite a recent increase in the number of CNN studies predicting the invasion depth of esophageal cancer, only three studies met our inclusion and exclusion criteria; accurate prediction of the invasion depth, which is the foundation of early diagnosis and treatment, is vital for the further development of CNN for this disease. Third, there were insufficient data to compare the diagnostic abilities of endoscopists with those of CNN models. Although the majority of current studies reported high diagnostic accuracy for the CNN model, some aspects, such as the use of video datasets and prediction of the invasion depth, require additional supporting evidence. Fourth, the CNN training procedures in the included studies were not standardized, and the training datasets could not be recorded or used in subgroup analysis. A large, multicenter cohort analysis is needed to validate the use of CNN for esophageal cancer and HGD and to compare its diagnostic ability with that of endoscopists, and follow-up studies using the same video datasets are also needed.

CONCLUSION

In conclusion, the CNN model has excellent potential for accurately diagnosing esophageal cancers and HGD. It is anticipated to develop into an important diagnostic tool for endoscopists, showing promise for predicting the invasion depth of esophageal cancer.

ARTICLE HIGHLIGHTS
Research background

The development of the convolutional neural network (CNN) model as a novel diagnostic technology has facilitated the screening, early detection, and improved prognosis of esophageal cancer and high-grade dysplasia (HGD).

Research motivation

To explore the diagnostic value of the CNN model for esophageal cancer and HGD and to provide a basis for its clinical application.

Research objectives

To conduct a meta-analysis of the diagnostic accuracy of CNN models for the diagnosis of esophageal cancer and HGD.

Research methods

We searched four databases (PubMed, EMBASE, Web of Science, and Cochrane Library) for relevant studies, evaluated the diagnostic accuracy of CNN models, and calculated the diagnostic test accuracy with a bivariate method and a hierarchical summary receiver operating characteristic method. Meta-regression and subgroup analyses were used to identify sources of heterogeneity.

Research results

Based on the statistical analysis of 28 studies providing still image-based and video-based data, CNN models showed high accuracy and diagnostic efficiency in diagnosing esophageal cancer or HGD and in predicting the invasion depth of esophageal cancer.

Research conclusions

CNN-based image analysis in diagnosing esophageal cancer and HGD is an excellent diagnostic method with high sensitivity and specificity.

Research perspectives

A thorough evaluation of the diagnostic accuracy of CNN models for esophageal cancer and HGD requires further investigation. Large-scale trials are needed to assess their performance and determine their clinical value.

ACKNOWLEDGEMENTS

We thank the anonymous reviewers for their excellent criticism of the article.

Footnotes

Provenance and peer review: Unsolicited article; Externally peer reviewed.

Peer-review model: Single blind

Specialty type: Oncology

Country/Territory of origin: China

Peer-review report’s scientific quality classification

Grade A (Excellent): 0

Grade B (Very good): B

Grade C (Good): C

Grade D (Fair): 0

Grade E (Poor): 0

P-Reviewer: Sawadogo W, United States; Sumi K, Japan S-Editor: Wang JJ L-Editor: A P-Editor: Yuan YY

References
1. Sung H, Ferlay J, Siegel RL, Laversanne M, Soerjomataram I, Jemal A, Bray F. Global Cancer Statistics 2020: GLOBOCAN Estimates of Incidence and Mortality Worldwide for 36 Cancers in 185 Countries. CA Cancer J Clin. 2021;71:209-249.
2. Pennathur A, Gibson MK, Jobe BA, Luketich JD. Oesophageal carcinoma. Lancet. 2013;381:400-412.
3. Arnold M, Soerjomataram I, Ferlay J, Forman D. Global incidence of oesophageal cancer by histological subtype in 2012. Gut. 2015;64:381-387.
4. Castro C, Bosetti C, Malvezzi M, Bertuccio P, Levi F, Negri E, La Vecchia C, Lunet N. Patterns and trends in esophageal cancer mortality and incidence in Europe (1980-2011) and predictions to 2015. Ann Oncol. 2014;25:283-290.
5. Edgren G, Adami HO, Weiderpass E, Nyrén O. A global assessment of the oesophageal adenocarcinoma epidemic. Gut. 2013;62:1406-1414.
6. Morales CP, Souza RF, Spechler SJ. Hallmarks of cancer progression in Barrett's oesophagus. Lancet. 2002;360:1587-1589.
7. Jemal A, Siegel R, Ward E, Murray T, Xu J, Thun MJ. Cancer statistics, 2007. CA Cancer J Clin. 2007;57:43-66.
8. Solanky D, Krishnamoorthi R, Crews N, Johnson M, Wang K, Wolfsen H, Fleischer D, Ramirez FC, Katzka D, Buttar N, Iyer PG. Barrett Esophagus Length, Nodularity, and Low-grade Dysplasia are Predictive of Progression to Esophageal Adenocarcinoma. J Clin Gastroenterol. 2019;53:361-365.
9. Siersema PD. Esophageal cancer. Gastroenterol Clin North Am. 2008;37:943-964, x.
10. Wang GQ, Jiao GG, Chang FB, Fang WH, Song JX, Lu N, Lin DM, Xie YQ, Yang L. Long-term results of operation for 420 patients with early squamous cell esophageal carcinoma discovered by screening. Ann Thorac Surg. 2004;77:1740-1744.
11. Kitagawa Y, Uno T, Oyama T, Kato K, Kato H, Kawakubo H, Kawamura O, Kusano M, Kuwano H, Takeuchi H, Toh Y, Doki Y, Naomoto Y, Nemoto K, Booka E, Matsubara H, Miyazaki T, Muto M, Yanagisawa A, Yoshida M. Correction to: Esophageal cancer practice guidelines 2017 edited by the Japan Esophageal Society: part 1 and Part 2. Esophagus. 2022;19:726.
12. Pimentel-Nunes P, Dinis-Ribeiro M, Ponchon T, Repici A, Vieth M, De Ceglie A, Amato A, Berr F, Bhandari P, Bialek A, Conio M, Haringsma J, Langner C, Meisner S, Messmann H, Morino M, Neuhaus H, Piessevaux H, Rugge M, Saunders BP, Robaszkiewicz M, Seewald S, Kashin S, Dumonceau JM, Hassan C, Deprez PH. Endoscopic submucosal dissection: European Society of Gastrointestinal Endoscopy (ESGE) Guideline. Endoscopy. 2015;47:829-854.
13. Behrens A, Pech O, Graupe F, May A, Lorenz D, Ell C. Barrett's adenocarcinoma of the esophagus: better outcomes through new methods of diagnosis and treatment. Dtsch Arztebl Int. 2011;108:313-319.
14. Codipilly DC, Qin Y, Dawsey SM, Kisiel J, Topazian M, Ahlquist D, Iyer PG. Screening for esophageal squamous cell carcinoma: recent advances. Gastrointest Endosc. 2018;88:413-426.
15. Fagundes RB, de Barros SG, Pütten AC, Mello ES, Wagner M, Bassi LA, Bombassaro MA, Gobbi D, Souto EB. Occult dysplasia is disclosed by Lugol chromoendoscopy in alcoholics at high risk for squamous cell carcinoma of the esophagus. Endoscopy. 1999;31:281-285.
16. Kondo H, Fukuda H, Ono H, Gotoda T, Saito D, Takahiro K, Shirao K, Yamaguchi H, Yoshida S. Sodium thiosulfate solution spray for relief of irritation caused by Lugol's stain in chromoendoscopy. Gastrointest Endosc. 2001;53:199-202.
17. Li B, Cai SL, Tan WM, Li JC, Yalikong A, Feng XS, Yu HH, Lu PX, Feng Z, Yao LQ, Zhou PH, Yan B, Zhong YS. Comparative study on artificial intelligence systems for detecting early esophageal squamous cell carcinoma between narrow-band and white-light imaging. World J Gastroenterol. 2021;27:281-293.
18. Shin D, Protano MA, Polydorides AD, Dawsey SM, Pierce MC, Kim MK, Schwarz RA, Quang T, Parikh N, Bhutani MS, Zhang F, Wang G, Xue L, Wang X, Xu H, Anandasabapathy S, Richards-Kortum RR. Quantitative analysis of high-resolution microendoscopic images for diagnosis of esophageal squamous cell carcinoma. Clin Gastroenterol Hepatol. 2015;13:272-279.e2.
19. Shibagaki K, Amano Y, Ishimura N, Taniguchi H, Fujita H, Adachi S, Kakehi E, Fujita R, Kobayashi K, Kinoshita Y. Diagnostic accuracy of magnification endoscopy with acetic acid enhancement and narrow-band imaging in gastric mucosal neoplasms. Endoscopy. 2016;48:16-25.
20. Ohmori M, Ishihara R, Aoyama K, Nakagawa K, Iwagami H, Matsuura N, Shichijo S, Yamamoto K, Nagaike K, Nakahara M, Inoue T, Aoi K, Okada H, Tada T. Endoscopic detection and differentiation of esophageal lesions using a deep neural network. Gastrointest Endosc. 2020;91:301-309.e1.
21. Sheth D, Giger ML. Artificial intelligence in the interpretation of breast cancer on MRI. J Magn Reson Imaging. 2020;51:1310-1324.
22. Shkolyar E, Jia X, Chang TC, Trivedi D, Mach KE, Meng MQ, Xing L, Liao JC. Augmented Bladder Tumor Detection Using Deep Learning. Eur Urol. 2019;76:714-718.
23. Chartrand G, Cheng PM, Vorontsov E, Drozdzal M, Turcotte S, Pal CJ, Kadoury S, Tang A. Deep Learning: A Primer for Radiologists. Radiographics. 2017;37:2113-2131.
24. Babenko B, Mitani A, Traynis I, Kitade N, Singh P, Maa AY, Cuadros J, Corrado GS, Peng L, Webster DR, Varadarajan A, Hammel N, Liu Y. Detection of signs of disease in external photographs of the eyes via deep learning. Nat Biomed Eng. 2022;6:1370-1383.
25. Esteva A, Kuprel B, Novoa RA, Ko J, Swetter SM, Blau HM, Thrun S. Dermatologist-level classification of skin cancer with deep neural networks. Nature. 2017;542:115-118.
26. Hirasawa T, Aoyama K, Tanimoto T, Ishihara S, Shichijo S, Ozawa T, Ohnishi T, Fujishiro M, Matsuo K, Fujisaki J, Tada T. Application of artificial intelligence using a convolutional neural network for detecting gastric cancer in endoscopic images. Gastric Cancer. 2018;21:653-660.
27. Itoh T, Kawahira H, Nakashima H, Yata N. Deep learning analyzes Helicobacter pylori infection by upper gastrointestinal endoscopy images. Endosc Int Open. 2018;6:E139-E144.
28. Urban G, Tripathi P, Alkayali T, Mittal M, Jalali F, Karnes W, Baldi P. Deep Learning Localizes and Identifies Polyps in Real Time With 96% Accuracy in Screening Colonoscopy. Gastroenterology. 2018;155:1069-1078.e8.
29. Horie Y, Yoshio T, Aoyama K, Yoshimizu S, Horiuchi Y, Ishiyama A, Hirasawa T, Tsuchida T, Ozawa T, Ishihara S, Kumagai Y, Fujishiro M, Maetani I, Fujisaki J, Tada T. Diagnostic outcomes of esophageal cancer by artificial intelligence using convolutional neural networks. Gastrointest Endosc. 2019;89:25-32.
30. Huang LM, Yang WJ, Huang ZY, Tang CW, Li J. Artificial intelligence technique in detection of early esophageal cancer. World J Gastroenterol. 2020;26:5959-5969.
31. Nakagawa K, Ishihara R, Aoyama K, Ohmori M, Nakahira H, Matsuura N, Shichijo S, Nishida T, Yamada T, Yamaguchi S, Ogiyama H, Egawa S, Kishida O, Tada T. Classification for invasion depth of esophageal squamous cell carcinoma using a deep neural network compared with experienced endoscopists. Gastrointest Endosc. 2019;90:407-414.
32. Whiting PF, Rutjes AW, Westwood ME, Mallett S, Deeks JJ, Reitsma JB, Leeflang MM, Sterne JA, Bossuyt PM; QUADAS-2 Group. QUADAS-2: a revised tool for the quality assessment of diagnostic accuracy studies. Ann Intern Med. 2011;155:529-536.
33. Deeks JJ, Macaskill P, Irwig L. The performance of tests of publication bias and other sample size effects in systematic reviews of diagnostic test accuracy was assessed. J Clin Epidemiol. 2005;58:882-893.
34. Cai SL, Li B, Tan WM, Niu XJ, Yu HH, Yao LQ, Zhou PH, Yan B, Zhong YS. Using a deep learning system in endoscopy for screening of early esophageal squamous cell carcinoma (with video). Gastrointest Endosc. 2019;90:745-753.e2.
35. Ebigbo A, Mendel R, Probst A, Manzeneder J, Souza LA Jr, Papa JP, Palm C, Messmann H. Computer-aided diagnosis using deep learning in the evaluation of early oesophageal adenocarcinoma. Gut. 2019;68:1143-1145.
36. Ghatwary N, Zolgharni M, Ye X. Early esophageal adenocarcinoma detection using deep learning methods. Int J Comput Assist Radiol Surg. 2019;14:611-621.
37. Kumagai Y, Takubo K, Kawada K, Aoyama K, Endo Y, Ozawa T, Hirasawa T, Yoshio T, Ishihara S, Fujishiro M, Tamaru JI, Mochiki E, Ishida H, Tada T. Diagnosis using deep-learning artificial intelligence based on the endocytoscopic observation of the esophagus. Esophagus. 2019;16:180-187.
38. Zhao YY, Xue DX, Wang YL, Zhang R, Sun B, Cai YP, Feng H, Cai Y, Xu JM. Computer-assisted diagnosis of early esophageal squamous cell carcinoma using narrow-band imaging magnifying endoscopy. Endoscopy. 2019;51:333-341.
39. Liu G, Hua J, Wu Z, Meng T, Sun M, Huang P, He X, Sun W, Li X, Chen Y. Automatic classification of esophageal lesions in endoscopic images using a convolutional neural network. Ann Transl Med. 2020;8:486.
40. Guo L, Xiao X, Wu C, Zeng X, Zhang Y, Du J, Bai S, Xie J, Zhang Z, Li Y, Wang X, Cheung O, Sharma M, Liu J, Hu B. Real-time automated diagnosis of precancerous lesions and early esophageal squamous cell carcinoma using a deep learning model (with videos). Gastrointest Endosc. 2020;91:41-51.
41. Ebigbo A, Mendel R, Probst A, Manzeneder J, Prinz F, de Souza LA Jr, Papa J, Palm C, Messmann H. Real-time use of artificial intelligence in the evaluation of cancer in Barrett's oesophagus. Gut. 2020;69:615-616.
41.  Ebigbo A, Mendel R, Probst A, Manzeneder J, Prinz F, de Souza LA Jr, Papa J, Palm C, Messmann H. Real-time use of artificial intelligence in the evaluation of cancer in Barrett's oesophagus. Gut. 2020;69:615-616.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 84]  [Cited by in F6Publishing: 101]  [Article Influence: 25.3]  [Reference Citation Analysis (0)]
42.  Hashimoto R, Requa J, Dao T, Ninh A, Tran E, Mai D, Lugo M, El-Hage Chehade N, Chang KJ, Karnes WE, Samarasena JB. Artificial intelligence using convolutional neural networks for real-time detection of early esophageal neoplasia in Barrett's esophagus (with video). Gastrointest Endosc. 2020;91:1264-1271.e1.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 102]  [Cited by in F6Publishing: 116]  [Article Influence: 29.0]  [Reference Citation Analysis (0)]
43.  de Groof AJ, Struyvenberg MR, Fockens KN, van der Putten J, van der Sommen F, Boers TG, Zinger S, Bisschops R, de With PH, Pouw RE, Curvers WL, Schoon EJ, Bergman JJGHM. Deep learning algorithm detection of Barrett's neoplasia with high accuracy during live endoscopic procedures: a pilot study (with video). Gastrointest Endosc. 2020;91:1242-1250.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 66]  [Cited by in F6Publishing: 66]  [Article Influence: 16.5]  [Reference Citation Analysis (0)]
44.  de Groof AJ, Struyvenberg MR, van der Putten J, van der Sommen F, Fockens KN, Curvers WL, Zinger S, Pouw RE, Coron E, Baldaque-Silva F, Pech O, Weusten B, Meining A, Neuhaus H, Bisschops R, Dent J, Schoon EJ, de With PH, Bergman JJ. Deep-Learning System Detects Neoplasia in Patients With Barrett's Esophagus With Higher Accuracy Than Endoscopists in a Multistep Training and Validation Study With Benchmarking. Gastroenterology. 2020;158:915-929.e4.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 172]  [Cited by in F6Publishing: 182]  [Article Influence: 45.5]  [Reference Citation Analysis (0)]
45.  Du W, Rao N, Dong C, Wang Y, Hu D, Zhu L, Zeng B, Gan T. Automatic classification of esophageal disease in gastroscopic images using an efficient channel attention deep dense convolutional neural network. Biomed Opt Express. 2021;12:3066-3081.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 16]  [Cited by in F6Publishing: 18]  [Article Influence: 6.0]  [Reference Citation Analysis (0)]
46.  Tang D, Wang L, Jiang J, Liu Y, Ni M, Fu Y, Guo H, Wang Z, An F, Zhang K, Hu Y, Zhan Q, Xu G, Zou X. A Novel Deep Learning System for Diagnosing Early Esophageal Squamous Cell Carcinoma: A Multicenter Diagnostic Study. Clin Transl Gastroenterol. 2021;12:e00393.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 11]  [Cited by in F6Publishing: 4]  [Article Influence: 1.3]  [Reference Citation Analysis (0)]
47.  Yang XX, Li Z, Shao XJ, Ji R, Qu JY, Zheng MQ, Sun YN, Zhou RC, You H, Li LX, Feng J, Yang XY, Li YQ, Zuo XL. Real-time artificial intelligence for endoscopic diagnosis of early esophageal squamous cell cancer (with video). Dig Endosc. 2021;33:1075-1084.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 14]  [Cited by in F6Publishing: 24]  [Article Influence: 8.0]  [Reference Citation Analysis (0)]
48.  Wang YK, Syu HY, Chen YH, Chung CS, Tseng YS, Ho SY, Huang CW, Wu IC, Wang HC. Endoscopic Images by a Single-Shot Multibox Detector for the Identification of Early Cancerous Lesions in the Esophagus: A Pilot Study. Cancers (Basel). 2021;13.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 10]  [Cited by in F6Publishing: 26]  [Article Influence: 8.7]  [Reference Citation Analysis (0)]
49.  Gong EJ, Bang CS, Jung K, Kim SJ, Kim JW, Seo SI, Lee U, Maeng YB, Lee YJ, Lee JI, Baik GH, Lee JJ. Deep-Learning for the Diagnosis of Esophageal Cancers and Precursor Lesions in Endoscopic Images: A Model Establishment and Nationwide Multicenter Performance Verification Study. J Pers Med. 2022;12.  [PubMed]  [DOI]  [Cited in This Article: ]  [Reference Citation Analysis (0)]
50.  Zhao Z, Li M, Liu P, Yu J, Zhao H. Efficacy of Digestive Endoscope Based on Artificial Intelligence System in Diagnosing Early Esophageal Carcinoma. Comput Math Methods Med. 2022;2022:9018939.  [PubMed]  [DOI]  [Cited in This Article: ]  [Reference Citation Analysis (0)]
51.  Fukuda H, Ishihara R, Kato Y, Matsunaga T, Nishida T, Yamada T, Ogiyama H, Horie M, Kinoshita K, Tada T. Comparison of performances of artificial intelligence versus expert endoscopists for real-time assisted diagnosis of esophageal squamous cell carcinoma (with video). Gastrointest Endosc. 2020;92:848-855.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 45]  [Cited by in F6Publishing: 52]  [Article Influence: 13.0]  [Reference Citation Analysis (0)]
52.  Struyvenberg MR, de Groof AJ, van der Putten J, van der Sommen F, Baldaque-Silva F, Omae M, Pouw R, Bisschops R, Vieth M, Schoon EJ, Curvers WL, de With PH, Bergman JJ. A computer-assisted algorithm for narrow-band imaging-based tissue characterization in Barrett's esophagus. Gastrointest Endosc. 2021;93:89-98.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 27]  [Cited by in F6Publishing: 40]  [Article Influence: 13.3]  [Reference Citation Analysis (0)]
53.  Waki K, Ishihara R, Kato Y, Shoji A, Inoue T, Matsueda K, Miyake M, Shimamoto Y, Fukuda H, Matsuura N, Ono Y, Yao K, Hashimoto S, Terai S, Ohmori M, Tanaka K, Kato M, Shono T, Miyamoto H, Tanaka Y, Tada T. Usefulness of an artificial intelligence system for the detection of esophageal squamous cell carcinoma evaluated with videos simulating overlooking situation. Dig Endosc. 2021;33:1101-1109.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 12]  [Cited by in F6Publishing: 12]  [Article Influence: 4.0]  [Reference Citation Analysis (0)]
54.  Shiroma S, Yoshio T, Kato Y, Horie Y, Namikawa K, Tokai Y, Yoshimizu S, Yoshizawa N, Horiuchi Y, Ishiyama A, Hirasawa T, Tsuchida T, Akazawa N, Akiyama J, Tada T, Fujisaki J. Ability of artificial intelligence to detect T1 esophageal squamous cell carcinoma from endoscopic videos and the effects of real-time assistance. Sci Rep. 2021;11:7759.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 6]  [Cited by in F6Publishing: 6]  [Article Influence: 2.0]  [Reference Citation Analysis (0)]
55.  Yuan XL, Guo LJ, Liu W, Zeng XH, Mou Y, Bai S, Pan ZG, Zhang T, Pu WF, Wen C, Wang J, Zhou ZD, Feng J, Hu B. Artificial intelligence for detecting superficial esophageal squamous cell carcinoma under multiple endoscopic imaging modalities: A multicenter study. J Gastroenterol Hepatol. 2022;37:169-178.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 6]  [Cited by in F6Publishing: 7]  [Article Influence: 3.5]  [Reference Citation Analysis (0)]
56.  Tajiri A, Ishihara R, Kato Y, Inoue T, Matsueda K, Miyake M, Waki K, Shimamoto Y, Fukuda H, Matsuura N, Egawa S, Yamaguchi S, Ogiyama H, Ogiso K, Nishida T, Aoi K, Tada T. Utility of an artificial intelligence system for classification of esophageal lesions when simulating its clinical use. Sci Rep. 2022;12:6677.  [PubMed]  [DOI]  [Cited in This Article: ]  [Reference Citation Analysis (0)]
57.  Tokai Y, Yoshio T, Aoyama K, Horie Y, Yoshimizu S, Horiuchi Y, Ishiyama A, Tsuchida T, Hirasawa T, Sakakibara Y, Yamada T, Yamaguchi S, Fujisaki J, Tada T. Application of artificial intelligence using convolutional neural networks in determining the invasion depth of esophageal squamous cell carcinoma. Esophagus. 2020;17:250-256.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 64]  [Cited by in F6Publishing: 63]  [Article Influence: 15.8]  [Reference Citation Analysis (0)]
58.  Chen R, Liu Y, Song G, Li B, Zhao D, Hua Z, Wang X, Li J, Hao C, Zhang L, Liu S, Wang J, Zhou J, Zhang Y, Li Y, Feng X, Li L, Dong Z, Wei W, Wang G. Effectiveness of one-time endoscopic screening programme in prevention of upper gastrointestinal cancer in China: a multicentre population-based cohort study. Gut. 2021;70:251-260.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 24]  [Cited by in F6Publishing: 62]  [Article Influence: 20.7]  [Reference Citation Analysis (0)]
59.  Mehrer J, Spoerer CJ, Jones EC, Kriegeskorte N, Kietzmann TC. An ecologically motivated image dataset for deep learning yields better models of human vision. Proc Natl Acad Sci U S A. 2021;118.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 68]  [Cited by in F6Publishing: 36]  [Article Influence: 12.0]  [Reference Citation Analysis (0)]
60.  Lui TKL, Tsui VWM, Leung WK. Accuracy of artificial intelligence-assisted detection of upper GI lesions: a systematic review and meta-analysis. Gastrointest Endosc. 2020;92:821-830.e9.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 47]  [Cited by in F6Publishing: 45]  [Article Influence: 11.3]  [Reference Citation Analysis (0)]
61.  Kubo K, Fujino MA. Ultra-high magnification endoscopy of the normal esophageal mucosa. Gastrointest Endosc. 1997;46:96-97.  [PubMed]  [DOI]  [Cited in This Article: ]
62.  Oyama T, Inoue H, Arima M, Momma K, Omori T, Ishihara R, Hirasawa D, Takeuchi M, Tomori A, Goda K. Prediction of the invasion depth of superficial squamous cell carcinoma based on microvessel morphology: magnifying endoscopic classification of the Japan Esophageal Society. Esophagus. 2017;14:105-112.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 159]  [Cited by in F6Publishing: 189]  [Article Influence: 27.0]  [Reference Citation Analysis (0)]
63.  Ding Z, Shi H, Zhang H, Meng L, Fan M, Han C, Zhang K, Ming F, Xie X, Liu H, Liu J, Lin R, Hou X. Gastroenterologist-Level Identification of Small-Bowel Diseases and Normal Variants by Capsule Endoscopy Using a Deep-Learning Model. Gastroenterology. 2019;157:1044-1054.e5.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 152]  [Cited by in F6Publishing: 170]  [Article Influence: 34.0]  [Reference Citation Analysis (0)]
64.  Shichijo S, Nomura S, Aoyama K, Nishikawa Y, Miura M, Shinagawa T, Takiyama H, Tanimoto T, Ishihara S, Matsuo K, Tada T. Application of Convolutional Neural Networks in the Diagnosis of Helicobacter pylori Infection Based on Endoscopic Images. EBioMedicine. 2017;25:106-111.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 157]  [Cited by in F6Publishing: 163]  [Article Influence: 23.3]  [Reference Citation Analysis (0)]
65.  Inoue H, Kazawa T, Sato Y, Satodate H, Sasajima K, Kudo SE, Shiokawa A. In vivo observation of living cancer cells in the esophagus, stomach, and colon using catheter-type contact endoscope, "Endo-Cytoscopy system". Gastrointest Endosc Clin N Am. 2004;14:589-594, x.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 72]  [Cited by in F6Publishing: 67]  [Article Influence: 3.4]  [Reference Citation Analysis (0)]
66.  Muto M, Minashi K, Yano T, Saito Y, Oda I, Nonaka S, Omori T, Sugiura H, Goda K, Kaise M, Inoue H, Ishikawa H, Ochiai A, Shimoda T, Watanabe H, Tajiri H, Saito D. Early detection of superficial squamous cell carcinoma in the head and neck region and esophagus by narrow band imaging: a multicenter randomized controlled trial. J Clin Oncol. 2010;28:1566-1572.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 427]  [Cited by in F6Publishing: 482]  [Article Influence: 34.4]  [Reference Citation Analysis (0)]
67.  Hölscher AH, Bollschweiler E, Schröder W, Metzger R, Gutschow C, Drebber U. Prognostic impact of upper, middle, and lower third mucosal or submucosal infiltration in early esophageal cancer. Ann Surg. 2011;254:802-7; discussion 807.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 96]  [Cited by in F6Publishing: 74]  [Article Influence: 6.2]  [Reference Citation Analysis (0)]
68.  Yamashina T, Ishihara R, Nagai K, Matsuura N, Matsui F, Ito T, Fujii M, Yamamoto S, Hanaoka N, Takeuchi Y, Higashino K, Uedo N, Iishi H. Long-term outcome and metastatic risk after endoscopic resection of superficial esophageal squamous cell carcinoma. Am J Gastroenterol. 2013;108:544-551.  [PubMed]  [DOI]  [Cited in This Article: ]  [Cited by in Crossref: 166]  [Cited by in F6Publishing: 202]  [Article Influence: 18.4]  [Reference Citation Analysis (0)]