Review Open Access
Copyright ©The Author(s) 2024. Published by Baishideng Publishing Group Inc. All rights reserved.
Artif Intell Gastrointest Endosc. Jun 8, 2024; 5(2): 90704
Published online Jun 8, 2024. doi: 10.37126/aige.v5.i2.90704
Impact of artificial intelligence in the management of esophageal, gastric and colorectal malignancies
Ayrton Bangolo, Nikita Wadhwani, Vignesh K Nagesh, Shraboni Dey, Hadrian Hoang-Vu Tran, Izage Kianifar Aguilar, Aman Sidiqui, Aiswarya Menon, James Liu, Blessy George, Flor Furman, Nareeman Khan, Adewale Plumptre, Imranjot Sekhon, Simcha Weissman, Department of Internal Medicine, Palisades Medical Center, North Bergen, NJ 07047, United States
Auda Auda, Deborah Daoud, Sai Priyanka Pulipaka, Abraham Lo, Department of Medicine, Palisades Medical Center, North Bergen, NJ 07047, United States
ORCID number: Ayrton Bangolo (0000-0002-2133-2480); Simcha Weissman (0000-0002-0796-6217).
Author contributions: Bangolo A searched the literature, wrote, and revised the manuscript; Wadhwani N, Nagesh VK, Dey S, Tran H, Aguilar IK, Auda A, Sidiqui A, Menon A, Daoud D, Liu J, Pulipaka P, George B, Furman F, Khan N, Plumptre A, and Sekhon I wrote, revised and edited the manuscript; Weissman S and Lo A wrote, revised and approved the final version and are the article’s guarantors; All authors certify that they contributed sufficiently to the intellectual content and data analysis; Each author has reviewed the final version of the manuscript and approves it for publication.
Conflict-of-interest statement: No potential conflict of interest was reported by the authors.
Open-Access: This article is an open-access article that was selected by an in-house editor and fully peer-reviewed by external reviewers. It is distributed in accordance with the Creative Commons Attribution NonCommercial (CC BY-NC 4.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited and the use is non-commercial.
Corresponding author: Ayrton Bangolo, MBBS, MD, Doctor, Department of Internal Medicine, Palisades Medical Center, 7600 River Road, North Bergen, NJ 07047, United States.
Received: December 12, 2023
Revised: January 28, 2024
Accepted: March 4, 2024
Published online: June 8, 2024


The incidence of gastrointestinal malignancies has increased over the past decade at an alarming rate. Colorectal and gastric cancers are the third and fifth most commonly diagnosed cancers worldwide but are cited as the second and third leading causes of mortality. Early institution of appropriate therapy following timely diagnosis can optimize patient outcomes. Artificial intelligence (AI)-assisted diagnostic, prognostic, and therapeutic tools can assist in expeditious diagnosis, treatment planning/response prediction, and post-surgical prognostication. AI can intercept neoplastic lesions in their earliest stages, flag suspicious and/or inconspicuous lesions with greater accuracy on radiologic, histopathological, and/or endoscopic analyses, and reduce over-reliance on individual clinicians. AI-based models have been shown to perform on par with, and sometimes even outperform, experienced gastroenterologists and radiologists. Convolutional neural networks (state-of-the-art deep learning models) are powerful computational models, invaluable to the field of precision oncology. These models not only reliably classify images, but also accurately predict response to chemotherapy, tumor recurrence, metastasis, and survival rates post-treatment. In this systematic review, we analyze the available evidence on the diagnostic, prognostic, and therapeutic utility of artificial intelligence in gastrointestinal oncology.

Key Words: Artificial intelligence, Gastrointestinal malignancies, Machine learning, Helicobacter pylori, State-of-the-art deep learning models

Core Tip: The application of artificial intelligence to gastrointestinal malignancies has burgeoned over the past decade, as its incorporation has streamlined the work-up of these cancers in response to alarming mortality statistics that largely result from delayed detection. That delay, set against the abundant array of contemporary diagnostic, predictive, and prognostic tools, is a testament to their underperformance and calls for the development of digital tools that can optimize the oncologic work-up and pave the way for personalized therapies.


Artificial intelligence (AI) is a broad term encompassing the utilization of computer systems to perform tasks that traditionally require human intelligence. This encompasses various fields, including machine learning (ML), which itself includes specialized areas such as deep learning (DL). ML involves training machines to recognize patterns within input data to make predictions in clinical settings. This process relies on extensive datasets to identify correlations between different variables and develop predictive models that can be applied to new data. The accuracy of ML algorithms is directly influenced by the volume of data fed into the model. DL, on the other hand, is rooted in complex neural networks with multiple layers that learn from extensive datasets. These models continuously improve through repeated iterations[1]. The rapid advancement of AI algorithms in healthcare is attributed to the abundance of electronic health records, genomic information, biomedical research, and the integration of telemedicine. This progression has significantly transformed the healthcare sector into a rich source of data[2]. Gastrointestinal (GI) cancer is one of the most prevalent malignancies in the world, with over 2.8 million deaths annually[3]. Recent advancements in the diagnosis and treatment of GI tumors have led to improved clinical outcomes. Nevertheless, the volume of data on these malignancies now exceeds what human interpretation alone can manage. Gastroenterologists rely heavily on radiological and endoscopic imaging. AI has shown promise in expediting cancer diagnosis and in reducing misdiagnosis, interobserver variability in visual classification, and errors in radiologic/histopathologic interpretation[4]. In this review, we explore the utility of AI models in the diagnosis, treatment, and prognostication of GI malignancies (Table 1).
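To make the ML workflow described above concrete, the following is a minimal, purely illustrative sketch (not a clinical model): a simple classifier learns a decision rule from labeled training examples and is then evaluated on held-out data. The two-feature synthetic "patients", the labels, and the nearest-centroid method are all hypothetical choices made only for illustration.

```python
import random

def train_nearest_centroid(examples):
    """examples: list of (feature_vector, label) pairs with labels 0/1.
    'Training' here is just computing the mean feature vector per class."""
    centroids = {}
    for label in (0, 1):
        rows = [x for x, y in examples if y == label]
        dim = len(rows[0])
        centroids[label] = [sum(r[i] for r in rows) / len(rows) for i in range(dim)]
    return centroids

def predict(centroids, x):
    # Assign the class whose centroid is nearest (squared Euclidean distance).
    def dist2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(centroids, key=lambda label: dist2(x, centroids[label]))

random.seed(0)
# Synthetic cohorts: class 0 clusters near (1, 1); class 1 clusters near (3, 3).
train = [([random.gauss(1, 0.5), random.gauss(1, 0.5)], 0) for _ in range(50)] + \
        [([random.gauss(3, 0.5), random.gauss(3, 0.5)], 1) for _ in range(50)]
test = [([random.gauss(1, 0.5), random.gauss(1, 0.5)], 0) for _ in range(25)] + \
       [([random.gauss(3, 0.5), random.gauss(3, 0.5)], 1) for _ in range(25)]

model = train_nearest_centroid(train)
# Evaluate on unseen cases, as a validation cohort would be used clinically.
accuracy = sum(predict(model, x) == y for x, y in test) / len(test)
print(f"held-out accuracy: {accuracy:.2f}")
```

As the review notes, accuracy in real systems is directly influenced by the volume and quality of training data; this toy problem is deliberately easy.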

Table 1 Application of artificial intelligence in gastrointestinal malignancies.
Esophageal cancer: Lesion characterization (Barrett's esophagus vs dysplasia vs ESCC)[7-9]
Predicting histological response to therapy[30-32]
Prognostic assessment[33-37]
Gastric cancer: Endoscopic detection of early GC[44-46]
Differentiation between stage IV GC and PGL[48,49]
Accurate TNM staging[54-57]
Colon cancer: Higher adenoma detection, reduced neoplasia "miss rate"[72-74]
Superior LNM estimation[81-85]
Intraoperative guidance during colorectal resection[92-95]

Esophageal cancer (EC) is the seventh most commonly diagnosed cancer and the sixth leading cause of cancer mortality worldwide[3]. Esophageal squamous cell carcinoma (ESCC) and esophageal adenocarcinoma (EAC) are the two primary histological subtypes of EC, with the latter being more prevalent in the US. A diagnostic biopsy may be obtained by upper endoscopy or, in cases of metastasis, by image-guided biopsy of a metastatic site. Endoscopically, early EC appears as superficial plaques, nodules, or ulcerations. Endoscopic tissue biopsy is the gold standard for EC diagnosis: a single biopsy has a 93% diagnostic accuracy, and multiple biopsies improve the yield to over 98%. Nonetheless, about 6.5% of EC cases are missed during diagnostic endoscopy[5].

A mere 20% average five-year overall survival (OS) rate indicates the need for improvement in currently available EC diagnostics. AI-driven imaging analysis plays a pivotal role in the early detection and diagnosis of EC. DL algorithms can analyze endoscopic images and radiological scans to identify suspicious lesions or abnormalities, and AI models have been shown to interpret these images with high precision, distinguishing between normal tissue, benign malformations, and malignant growth[6]. These techniques enable accurate detection that is difficult to achieve through human interpretation alone. AI accelerates diagnosis by streamlining the analysis of medical images and helping ensure that even subtle anomalies are not overlooked, which speeds the start of suitable treatment and may improve patient outcomes. ESCC diagnosis is primarily contingent on the endoscopist's level of expertise, limiting detection at early stages. Various AI techniques that analyze endoscopic, radiologic, clinicopathologic, and genetic factors have been developed to improve the unfavorable prognosis associated with this condition. The development of AI-based EC diagnostics is geared toward the identification of high-grade dysplasia or early-stage ESCC and EAC (T1 cancers) arising from Barrett's esophagus (BE)[7]. A recent meta-analysis of the efficacy of AI in detecting ESCC showed that AI increases the detection of early-stage EC and evaluates tumor depth of invasion more precisely than endoscopists alone[8]. AI has also shown efficacy in the detection of esophageal metaplasia and neoplasia in BE, where it is not unusual for endoscopists to miss neoplastic lesions. Computer-aided detection (CADe) systems have an accuracy rate of 89% in identifying early neoplasia in people diagnosed with BE and have outperformed endoscopists in some studies[9].
Additionally, a meta-analysis by Islam et al[10] analyzing over 700,000 images from 28 studies demonstrated that DL-based diagnosis of EC had a pooled accuracy, sensitivity, and specificity of 92.90%, 93.80%, and 91.73%, respectively.
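For readers less familiar with these metrics, accuracy, sensitivity, and specificity are all defined from confusion-matrix counts. The sketch below uses hypothetical counts, chosen only to land near the quoted pooled values, and is not data from the cited meta-analysis.

```python
# How accuracy, sensitivity, and specificity relate to confusion-matrix counts.
# The counts below are hypothetical, chosen only to illustrate the arithmetic.

def diagnostic_metrics(tp, fp, tn, fn):
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    sensitivity = tp / (tp + fn)   # true-positive rate: cancers correctly flagged
    specificity = tn / (tn + fp)   # true-negative rate: benign correctly cleared
    return accuracy, sensitivity, specificity

# e.g., 469 true positives, 31 missed cancers, 458 true negatives, 42 false alarms
acc, sens, spec = diagnostic_metrics(tp=469, fp=42, tn=458, fn=31)
print(f"accuracy={acc:.1%} sensitivity={sens:.1%} specificity={spec:.1%}")
```

Note that sensitivity and specificity move in opposite directions as a model's decision threshold shifts, which is why studies report both alongside accuracy.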

The gastrointestinal artificial intelligence diagnostic system uses DL to diagnose upper GI malignancies by analyzing imaging data from clinical endoscopies, with diagnostic accuracy and sensitivity similar to those of expert endoscopists and superior to those of non-expert endoscopists[11]. CADe using deep neural networks (DNNs) for screening of early ESCC has also demonstrated high accuracy and sensitivity and can assist endoscopists in detecting lesions previously missed under white-light imaging[12].

AI-based convolutional neural networks (CNNs) are models composed of convolutional filters whose primary function is to learn and extract the features needed for efficient medical image understanding. There is evidence that CNNs can detect early neoplasia with a sensitivity of 96.4%, a specificity of 94.2%, and an accuracy of 95.4%. An AI object-detection algorithm draws a localization box around areas of dysplasia with high precision and at a speed that allows real-time implementation[13,14]. Another form of CNN, the single-shot multibox detector, which detects objects in images using a single DNN, was used in a pilot study and showed a sensitivity, specificity, and diagnostic accuracy of 96.2%, 70.4%, and 90.9%, respectively, for EC detection[15]. CNN models have also been used to analyze computed tomography (CT) images for EC detection, with a diagnostic accuracy of 84.2%, sensitivity of 71.7%, and specificity of 90.0%[16].
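The "convolutional filter" at the heart of a CNN can be illustrated in a few lines of code: a small kernel slides over an image and responds strongly wherever a target pattern (here, a vertical intensity edge) appears. In a trained CNN the kernels are learned from data rather than hand-coded; the toy image and kernel below are purely illustrative.

```python
# Minimal sketch of what a convolutional filter does: slide a small kernel
# over an image and respond strongly wherever a target pattern appears.

def conv2d(image, kernel):
    kh, kw = len(kernel), len(kernel[0])
    h, w = len(image), len(image[0])
    out = []
    for i in range(h - kh + 1):
        row = []
        for j in range(w - kw + 1):
            # Element-wise product of the kernel with the image patch, summed.
            row.append(sum(image[i + di][j + dj] * kernel[di][dj]
                           for di in range(kh) for dj in range(kw)))
        out.append(row)
    return out

# Synthetic 6x6 "image": dark left half, bright right half (a vertical edge).
image = [[0, 0, 0, 1, 1, 1] for _ in range(6)]
# Hand-coded Sobel-style vertical-edge kernel.
kernel = [[-1, 0, 1],
          [-2, 0, 2],
          [-1, 0, 1]]

response = conv2d(image, kernel)
print(response[0])  # strongest response at the edge columns: [0, 4, 4, 0]
```

A CNN stacks many such filters (with learned weights) and nonlinearities, so deeper layers respond to progressively more complex patterns than a simple edge.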

A study employing a fully convolutional network, a technique that performs segmentation entirely within a CNN to enable end-to-end learning, also showed promising results. The mean diagnostic accuracy of the model was 89.2% and 93% at the lesion and pixel levels, respectively, and the interobserver agreement between the model and the gold standard was 0.72[17].

Volumetric laser endomicroscopy (VLE) is a promising imaging tool for finding dysplasia in BE at an early stage by acquiring cross-sectional images of the microscopic structure of BE up to 3 mm deep. Unfortunately for clinicians, interpretation is complex due to both the size and subtlety of the gray-scale data. Studies developing CADe algorithms that encode VLE data with deep learning strategies are underway[18].

The utilization of magnifying electronic chromoendoscopy with narrow band imaging or blue light imaging (ME-NBI/BLI) enhances the detection of subtle changes in intramucosal capillary loops, crucial for early ESCC diagnosis. However, its effectiveness relies heavily on the operator's expertise and necessitates an objective and precise evaluation method. A study employing AI-assisted ME-NBI in ESCC diagnosis demonstrated an average diagnostic accuracy of 93.0% at the image level. The model's diagnostic accuracy for inflammatory lesions (92.5%) surpassed that of intermediate-level (88.1%) and primary-level (86.3%) endoscopists. Moreover, the model's diagnostic accuracy for malignant lesions (B1, 87.6%; B2, 93.9%) significantly exceeded that of the intermediate (B1, 79.1%; B2, 90.0%) and junior (B1, 69.2%; B2, 79.3%) endoscopist groups[19]. Recent evidence has reiterated that AI models confer higher diagnostic accuracy than stand-alone endoscopist detection of early-stage EC, with studies demonstrating comparable and often superior efficacy of AI vs clinician-based diagnosis[20]. A recent study employing a robustly trained and validated AI model for the identification of early ESCC found that the algorithm achieved per-patient sensitivity and specificity of 99.5% and 100%, respectively, for white-light imaging, and accuracy, sensitivity, and specificity of 97.0%, 97.2%, and 96.4%, respectively, for optical enhancement/iodine staining using non-magnified images. The model had 88.1% per-patient accuracy for magnified images, outperforming beginner endoscopists[21]. A recent meta-analysis by Zhang et al[22] on the accuracy of AI reiterated that detection reliant solely on endoscopists yielded inferior sensitivity (85%) compared with AI-assisted diagnosis (94%).

Circulating tumor cells (CTCs) have predictive potential for estimating prognosis, distant metastases, and response to treatment. CNNs are highly accurate and efficacious tools for CTC detection, successfully identifying different KYSE cell lines with minimal variance in epithelial cell adhesion molecule expression. Akashi et al[23] demonstrated that AI outperformed conventional diagnostics in distinguishing cancer from non-cancer cells (accuracy 100% vs 91.8%), with markedly faster processing times (0.74 s for AI vs 630.4 s for researchers; P = 0.012).

Accurate staging of EC is critical for identifying the best treatment option. Based on clinical and radiological information, ML models can assist in tumor staging, taking into account many variables, including but not limited to the location and size of the tumor and the involvement of lymph nodes. Zhang et al[24] employed an AI-based CADe system to detect pre-operative lymph node metastases (LNM) in patients with ESCC. The AI-CADe model demonstrated an accuracy of 74.4% in predicting LNM and markedly augmented the clinicians' diagnostic performance based on positron emission tomography-computed tomography (PET-CT) imaging analyses (accuracy 83.3% vs 71.2%; specificity 89.1% vs 69.7%) among patients in the validation cohort. Radiomics and DL models have been shown to yield reliable risk estimates for LNM in ESCC patients. Tan et al[25] developed a radiomics signature consisting of five features that were significantly associated with the presence of LNM in resectable ESCC cases, with an area under the curve (AUC) of 0.758 and 0.773 in the training and validation sets, respectively. Interestingly, the discriminatory power of the radiomics nomogram exceeded that of size criteria in both cohorts. Wu et al[26] devised a multiple-level CT radiomics model with demonstrable efficacy in predicting preoperative LNM in ESCC patients. The model integrated numerous radiomics signatures with clinical risk factors and exhibited robust discrimination, with a C-statistic of 0.840 in an independent external validation cohort.
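Since AUC figures recur throughout this review, a brief illustration of what they quantify may help: the AUC is the probability that a randomly chosen positive case (e.g., a patient with LNM) receives a higher predicted risk score than a randomly chosen negative case, with ties counting one half. The risk scores below are hypothetical, not taken from any cited study.

```python
# AUC as a pairwise ranking probability (equivalent to the Mann-Whitney U
# statistic normalized by the number of positive-negative pairs).

def auc(scores_pos, scores_neg):
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0          # positive correctly ranked above negative
            elif p == n:
                wins += 0.5          # ties count one half
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical predicted LNM risk for node-positive vs node-negative patients.
pos = [0.9, 0.8, 0.75, 0.6, 0.4]
neg = [0.7, 0.5, 0.35, 0.3, 0.2]

print(f"AUC = {auc(pos, neg):.2f}")
```

An AUC of 0.5 corresponds to chance-level ranking and 1.0 to perfect discrimination, which is why values such as 0.758 or 0.840 denote moderate-to-good discriminatory power.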

EC is a challenging entity to manage, with significant global impact, and treatment outcomes depend greatly on early and accurate diagnosis. Incorporation of AI-based diagnostic algorithms is paramount to address the existing shortcomings in this space, and AI-powered staging tools aid in optimizing treatment, resulting in customized and efficacious therapies[27].


EC exhibits significant heterogeneity, and treatment responses may vary among patients. AI-powered precision medicine has the potential to tailor therapies based on individual patient characteristics, optimizing treatment efficacy and minimizing adverse effects. This section explores the evolving role of AI in the treatment of EC, highlighting its applications in treatment planning, precision medicine, treatment response prediction, and post-surgical prognosis prediction.

By integrating genomic data, medical histories, and treatment outcomes, AI algorithms can identify predictive biomarkers and potential drug targets. Li et al[28] have discussed the role of AI in single-cell sequencing and spatial transcriptomics, which are crucial for understanding cell compositions and novel cell types in EC. AI algorithms can analyze patient-specific imaging data, allowing for improved target delineation and more precise radiation planning[29]. Such advancements improve patients’ quality of life and treatment compliance.

Adequate assessment of treatment response informs further therapeutic decisions, and several investigators have trained AI models to predict histologic response to therapy. Li et al[30] conducted a multicenter study assessing the therapeutic efficacy of chemoradiotherapy in patients with locally advanced ESCC using a 3D-radiomics model based on pretreatment CT images; response to treatment was predicted with a positive predictive value of 100% and an AUC of 0.80 across the radiation therapy plans, radiation fields, and prescription doses utilized. Ypsilantis et al[31] trained a CNN on pre-treatment PET-CT images to predict response to neoadjuvant chemotherapy, attaining an accuracy of 73.4%, with sensitivity and specificity of 80.7% and 81.6%, respectively. Warnecke-Eberz et al[32] developed an artificial neural network (ANN) system to analyze 17 genes with TaqMan low-density arrays in treatment-naive patients; histologic response to neoadjuvant chemoradiotherapy was predicted with an accuracy of 78.1%, and sensitivity and specificity of 75.0% and 81.0%, respectively. These findings highlight that AI can play a crucial role in determining response to treatment modalities, which may greatly influence the care of EC patients.

Prognosis for patients with EC varies considerably, and many patients benefit from surgical resection when amenable. AI models can be used prognostically to guide post-operative treatment decisions. Xu et al[33] constructed five ML models for survival risk stratification in 810 patients who underwent surgery for ESCC. Of the five, the XGBoost model demonstrated optimal performance with an AUC of 0.85, accurately predicting 5-year OS in both the training and validation cohorts across all three risk categories (low, moderate, and high). PET scans are routinely employed in the functional assessment of ESCC patients, and studies have trained 3D-CNN models on PET imagery to identify tumors with aggressive histopathological features and to generate survival estimates in this group. The presence of variables such as lymphovascular invasion (LVI) or perineural invasion (PNI) portends an unfavorable prognosis[34,35]. Yeh et al[36] trained a 3D-CNN model using PET imaging to predict LVI/PNI and obtained promising results. Yang et al[37] developed a deep-CNN model with 798 PET scans of ESCC patients to predict outcomes in this cohort, and notably obtained 1- and 5-year survival estimates with remarkable accuracy.
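Survival figures such as the 5-year OS estimates above are typically derived from censored follow-up data with the Kaplan-Meier estimator, against which a model's predicted risk strata are then compared. A minimal sketch, using hypothetical follow-up times rather than data from any cited study:

```python
# Kaplan-Meier estimator: at each observed death time, survival is multiplied
# by (patients at risk - deaths) / (patients at risk). Censored patients
# (lost to follow-up) leave the risk set without counting as deaths.

def kaplan_meier(times, events):
    """times: follow-up in months; events: 1 = death observed, 0 = censored.
    Returns a list of (time, survival probability) at each death time."""
    data = sorted(zip(times, events))
    at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(1 for tt, e in data if tt == t and e == 1)
        removed = sum(1 for tt, e in data if tt == t)
        if deaths:
            surv *= (at_risk - deaths) / at_risk
            curve.append((t, surv))
        at_risk -= removed
        i += removed
    return curve

# Hypothetical 10-patient cohort.
times  = [6, 12, 12, 20, 30, 42, 60, 60, 72, 80]
events = [1,  1,  0,  1,  0,  1,  0,  1,  0,  0]
for t, s in kaplan_meier(times, events):
    print(f"S({t} mo) = {s:.3f}")
```

Here the estimated 60-month (5-year) OS is about 41%; risk-stratification models such as the XGBoost example above are judged by how cleanly their strata separate curves like this one.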


AI has demonstrated immense potential in transforming the landscape of EC treatment. AI can assist with treatment planning in radiation therapy by optimizing radiation dose distribution and minimizing damage to surrounding healthy tissue: an AI-assisted therapeutic regimen helps ensure that the radiation dose is correctly delivered to the tumor while shielding adjacent normal tissue, as the framework considers the shape and location of the tumor and its relation to vital structures[29]. AI models have the potential to predict histologic response to chemoradiotherapy and can also yield reliable survival estimates for patients undergoing surgical resection. This information could enable accurate prognostication and serve as a guiding tool for clinicians and patients during decision-making[33-35].


The successful integration of AI into clinical practice necessitates rigorous validation, collaboration between AI developers and medical experts, and adherence to ethical standards. AI can be pivotal in ushering in an era of personalized and targeted therapies for EC patients based on specific tumor heterogeneity. The gap between AI model development and clinical implementation needs to be closed. Educating clinicians and fostering research in this arena can augment the incorporation of AI models into clinical practice, which could prove instrumental in optimizing survival outcomes.


Gastric cancer (GC) is the fifth most common cancer and the third most common cause of cancer death globally. It is diagnosed histologically after endoscopic biopsy and staged using CT, endoscopic ultrasound, PET, and laparoscopy[38,39]. Most cases are diagnosed at an advanced stage, when the prognosis is poor and treatment options are limited. Unfortunately, existing circulating diagnostic and prognostic biomarkers have low sensitivity and specificity, so diagnosis currently rests on endoscopy. Some recently discovered circulating molecules (miRNAs, lncRNAs, circRNAs) hold promise for new strategies for early diagnosis of GC and for discriminating early GC from healthy subjects, with a sensitivity of more than 77.5%[40].

GC develops through a series of precancerous stages, including Helicobacter pylori (H. pylori)-induced atrophic gastritis (AG), gastric intestinal metaplasia (GIM), dysplasia, and finally GC. Identifying these stages is essential for early detection and treatment of GC. AI-assisted diagnosis of AG and GIM has achieved an accuracy of approximately 90%, exceeding the accuracy achieved for predicting H. pylori infection[41-43]. Sensitivities ranging from 92.2% to 94.5% have been observed in AI-assisted early GC detection. Furthermore, AI-assisted detection systems have proven effective in reducing blind spots during endoscopy and enhancing early GC detection rates among less experienced endoscopists[44].

Recognizing that the ability of endoscopists to identify gastric lesions can vary, an AI system was developed and trained on 29,809 images from 8,947 patients, achieving an accuracy of 85.7% in diagnosing six common gastric lesions. When integrated into clinical practice, the AI system notably enhanced the overall accuracy of both senior and junior endoscopists, to 89.3% and 86.2%, respectively[45]. Goto et al[46] showed that AI-enabled assessment of invasion depth for early GC (EGC) was superior to stand-alone endoscopist assessment (F1 score 0.78 vs 0.66). However, contemporary literature indicating comparable diagnostic efficacy of computer-aided models and experts for EGC exists, warranting more well-powered prospective studies[47].

Primary gastric lymphoma (PGL) is a commonly diagnosed gastric malignancy that is difficult to differentiate from Borrmann type IV GC using conventional CT. The management for these two conditions varies significantly, with surgery being the primary treatment for GC, whereas PGL is best managed with chemotherapy and radiotherapy. A radiomics model demonstrated effective differentiation between Borrmann type IV gastric cancer and PGL with an AUC of 0.90[48], whereas a transfer learning model utilizing CT and whole slide images achieved high accuracy in distinguishing PGL from Borrmann type IV GC, with AUCs ranging from 0.92 to 0.99[49].

The prognosis of gastric adenocarcinoma (GA) is closely linked to its stage at diagnosis. GA is typically categorized into two groups: early GA (EGA) and advanced GA (AGA). EGA, involving the mucosa and submucosa (T1), boasts an impressive 5-year survival rate exceeding 90%. In contrast, AGA, with deeper infiltration (T2-T4), exhibits a 5-year survival rate ranging from 7% to 27%[50]. Clinical staging of GC is essential for determining the most appropriate treatment strategy. Endoscopic ultrasound (EUS) is used to assess tumor depth (T category) and nodal involvement (N category). However, its diagnostic accuracy can range from 57% to 88% for T staging and 30% to 90% for N staging. CT scanning is routinely employed for preoperative staging but has an overall accuracy ranging from 43% to 82% in measuring the depth of tumor invasion. ¹⁸F-fluorodeoxyglucose (FDG)-PET/CT imaging offers advantages over FDG-PET or CT alone, with a significantly higher accuracy rate in preoperative staging (68%) compared to FDG-PET (47%) or CT (53%) alone. In regions without screening programs for early detection, about 50% of patients present with advanced disease at diagnosis, which is associated with a poorer prognosis. Furthermore, the presence of metastases, poor performance status, and an alkaline phosphatase level ≥ 100 U/L are predictive of a less favorable outcome. The number of positive lymph nodes also significantly influences survival in patients with localized resectable disease[51].

Conventional CT-based GC staging is often erroneous, with accuracies ranging from 51.6% to 91.5%. For T-staging, and in particular for distinguishing T1 from T2, a DL model, the residual neural network ResNet101, has improved accuracy to 91.4%-94.6%. However, AI models have not yet been implemented in clinical practice, and EUS remains the preferred method. DL radiomics models have shown promise in serosal invasion assessment for T4a staging, achieving AUCs ranging from 0.76 to 0.90, but further clinical validation is needed. In addition, these models have enhanced detection rates for peritoneal metastasis (PM) and occult peritoneal masses (OPM) on CT scans, with AUC values ranging from 0.72 to 0.90, thereby outperforming conventional clinical models[52]. However, these models require manual segmentation of tumors[53].

Accurate assessment of tumor invasion depth is a critical factor in the diagnosis and treatment of GC. Traditional visual examination of endoscopic images is replete with challenges due to subtle morphological differences and subjective judgments, often requiring invasive histopathological examination for definitive diagnosis. Kubota et al[54] applied the back-propagation algorithm to train a multilayer perceptron model, achieving acceptable diagnostic accuracies for different T stages of GC: 77.2% for T1, 49.1% for T2, 51% for T3, and 55.3% for T4. This AI-based approach performed similarly to expert endoscopists, particularly in T1a and T1b staging. Another study, using ResNet to classify lesions as P0 (T1a or T1b) or P1 (deeper than T1b), achieved an AUC of 0.94, surpassing the sensitivity and specificity of endoscopists[55]. Yoon et al[56] applied the Visual Geometry Group (VGG) model, a type of deep CNN, to classify T1a and T1b EGC, yielding an AUC of 0.85, with incorrect predictions associated with undifferentiated-type histology and T1b staging. Furthermore, Nagao et al[57] extended AI-based depth prediction to various endoscopic image modalities, achieving high AUC scores of 94.5%, 94.3%, and 95.5% for white light imaging, narrow band imaging (NBI), and indigo carmine dye contrast images, respectively.


Chemotherapy along with endoscopic or surgical resection is indicated for locally advanced GC. However, both neoadjuvant and adjuvant chemotherapy confer varying results in terms of their efficacy and survival benefit in these patients. Ensuring proper selection of chemotherapy agents and patient populations is important to improve outcomes and reduce adverse events from chemotherapy. AI-based models can be used to accurately predict treatment response in GC.

A DL radiomics model, adapted from multiphasic contrast-enhanced CT imaging, was developed by Li et al[58] to predict the profile of patients with signet ring cell carcinoma (SRCC) likely to benefit from postoperative chemotherapy. The model was applied to pathologically confirmed, advanced SRCC patients and it could accurately differentiate between patients with a high likelihood of chemotherapy resistance and shorter OS (without vs with chemotherapy median OS 31.0 vs 54.4 months, P = 0.036) vs those with low risk of resistance and substantial chemotherapy benefit (without vs with chemotherapy median OS 26.0 vs not reached, P = 0.013).

AI can also be used to find new biomarkers that correlate with increased response to certain chemotherapeutics. Sundar et al[59] built a random forest machine learning model to analyze surgical samples and results from the Stomach cancer Adjuvant Multi-Institutional group Trial, a 2 × 2 factorial randomized phase III trial testing the efficacy of paclitaxel (Pac) as an additional chemotherapy agent to tegafur-gimeracil-oteracil (S-1) and tegafur-uracil (UFT). From 476 initial genes in their panel, the AI model was trained on the top 19 genes and applied to the Pac plus S-1 and Pac plus UFT cohorts. In both cohorts, patients with low gene expression derived greater survival benefit from paclitaxel than those with high gene expression.

Several ML models have shown promise in predicting GC prognosis, including survival and risk of recurrence, by combining multiple factors. Logistic regression, support vector machine, and extreme gradient-boosted decision tree models showed remarkable accuracy (89%) and AUC (0.87), outperforming a currently used risk assessment tool, the oesophagogastric cancer risk-assessment tool (ogRAT), and also had higher sensitivity for cancer diagnosis than ogRAT[60].

Several studies have utilized ANNs in their risk prediction models. Que et al[61] predicted the 3-year OS using a preoperative ANN that showed an accuracy, sensitivity, and specificity of 75.2%, 86.5%, and 43.8% respectively. Another study showed that ANN-based survival prediction (AUC 0.81) was comparable with actual 5-year survival estimates, and survival outcomes were accurately classified with ANN when compared with the AJCC staging system[62]. Based on these findings, prognostication can be performed with an acceptable degree of certainty. Notably, these studies evaluated clinicopathological parameters like tumor diameter and TNM factors. Alternative stratification to the TNM classification to predict outcomes was not explored in these models.
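At inference time, an ANN such as those used in these risk models simply propagates weighted sums through nonlinear activations, layer by layer, to produce a probability-like score. The sketch below shows a two-layer forward pass; the input features, weights, and resulting "risk" scores are hypothetical values fixed by hand for illustration (in practice the weights are learned by back-propagation on clinicopathological data).

```python
import math

# Bare-bones forward pass of a two-layer artificial neural network (ANN).
# All features, weights, and outputs here are hypothetical.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def mlp_forward(x, w_hidden, b_hidden, w_out, b_out):
    # Hidden layer: weighted sums of inputs, squashed by a nonlinearity.
    hidden = [sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
              for w, b in zip(w_hidden, b_hidden)]
    # Output layer: weighted sum of hidden activations -> probability-like score.
    return sigmoid(sum(wo * h for wo, h in zip(w_out, hidden)) + b_out)

# Hypothetical scaled inputs: [tumor diameter, nodal involvement, depth stage].
x_low_risk  = [0.2, 0.0, 0.1]
x_high_risk = [0.9, 1.0, 0.8]

# Hand-set (not learned) weights, chosen only so the example is deterministic.
w_hidden = [[2.0, 3.0, 1.5], [-1.0, 2.5, 2.0]]
b_hidden = [-2.0, -1.5]
w_out = [2.5, 2.0]
b_out = -2.5

low = mlp_forward(x_low_risk, w_hidden, b_hidden, w_out, b_out)
high = mlp_forward(x_high_risk, w_hidden, b_hidden, w_out, b_out)
print(f"predicted risk: low-risk patient {low:.2f}, high-risk patient {high:.2f}")
```

Training adjusts the weights so that these output scores track observed outcomes (e.g., 3-year OS), which is what the cited studies then evaluate via accuracy, sensitivity, specificity, and AUC.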

Zhang et al[63] utilized CT imaging to train a DL model for GC prognostication with acceptable accuracy, with C-indexes of 0.82 and 0.78 in the training and test datasets, respectively. Jiang et al[64] developed a multitask DL model to predict peritoneal recurrence (PR) after curative-intent surgery in GC patients, and showed that adjuvant chemotherapy improved disease-free survival (DFS) only for patients with a predicted high risk of PR and low survival, as opposed to patients with a low predicted PR risk and high survival. Similarly, Sun et al[65] trained a radiomics model using CT images to predict the likelihood of PR and the benefit of chemotherapy in patients undergoing radical gastrectomy. The radiomics signature accurately predicted PR, with AUCs of 0.73 and 0.72 in the training and validation cohorts, respectively, and significantly improved clinicians' diagnostic accuracy for PR by 10%-19% (P < 0.001); it was also an independent predictor of PR and survival. Patients with high PR scores benefitted from adjuvant chemotherapy, whereas chemotherapy had no impact on survival in patients with low scores. Prospective studies are required to validate such models and delineate the subset of high-risk patients who could benefit from adjuvant therapies.

Cheong et al[66] developed an AI algorithm called NTriPath to identify prognostic molecular pathways in gastric cancer. The algorithm integrated pan-cancer somatic mutation data, gene-to-gene interaction networks, and pathway databases to identify prognostic cancer-associated molecular pathways. They then tested the clinical relevance of these subtypes and created a risk-scoring model to predict both OS and response to therapies including chemotherapy and immune checkpoint blockade. Kuwayama et al[67] evaluated GC prognosis using blood collection data, excluding clinicopathological features such as tumor depth and LNM, which are typically employed in the conventional TNM classification. These strategies were able to predict the 5-year OS and recurrence-free survival with acceptable accuracy. These advancements highlight the potential of AI in improving the prognosis and prediction of GC.


AI-assisted GC screening is a rapidly developing field with the potential to significantly reduce GC incidence and improve patient outcomes. AI-assisted endoscopic detection of GC has achieved performance comparable to or higher than that of endoscopists, with an accuracy greater than 80%[41,42]. DL radiomics models can accurately distinguish between T stages, and also reliably predict PM and OPM[52]. AI-driven approaches offer promising solutions for accurate estimation of tumor invasion depth during endoscopy, which is crucial for determining appropriate treatment strategies, such as endoscopic resection, and improving diagnostic accuracy[54-57]. AI models can also be used to predict treatment response and risk of recurrence in GC patients. This information could be used to develop customized treatment plans based on specific patient and tumor profiles[61-66].


Challenges such as data scarcity, poor interpretability, and poor generalization persist. AI models are often convoluted and lack transparency. Moreover, DL radiomics models used for GC staging require manual tumor segmentation, which is time-consuming, and few have been tested in external validation cohorts. This remains a persistent challenge across TNM staging studies, limiting their practical application in clinical practice[53]. Unified processing procedures, data regulation, and advanced algorithms are needed to build more accurate and robust AI models.


Colorectal cancer (CRC) is the third most common cancer and the second leading cause of cancer-related death worldwide. AI models have demonstrated potential in predicting the risk of CRC development in individuals. ML algorithms can analyze vast datasets, encompassing genetic, lifestyle, and medical history information, to identify patterns and risk factors associated with the disease. Such risk prediction models can aid clinicians in identifying high-risk patients who may benefit from increased surveillance or preventive measures[68].

Colonoscopy has always been regarded as the gold standard procedure for diagnosing CRC. Because of significant inter-operator variability and frequent inadequate preparation, the detection rate of polyps and adenomas during early CRC screening can fluctuate substantially. To address this issue, researchers have employed CADe systems utilizing DL algorithms to enhance the efficiency and precision of CRC clinical detection while minimizing the likelihood of missed lesions[69]. Certain adenomas and polyps identified through real-time CADe may be small and pose low risk, yet they are often overlooked during conventional colonoscopy examinations. Missed colorectal neoplasia is associated with post-colonoscopy CRC development. AI-assisted colonoscopy can effectively improve the adenoma detection rate (ADR) and help reduce CRC incidence and mortality[70]. Additionally, AI-based colon capsule endoscopy increases the sensitivity of polyp detection through enhanced polyp visibility and better delineation of morphology, and significantly reduces the screening time[71].

In a multicenter randomized controlled trial, overall ADR per colonoscopy was significantly higher in the AI-assisted colonoscopy group compared to the conventional colonoscopy group, for both advanced and non-advanced adenomas, for adenomas < 5 mm and ≥ 10 mm, for non-pedunculated adenomas, and in both the proximal and distal colon. Consequently, there were more adenoma resections, which may translate to improved long-term cancer prevention[72]. Similar conclusions were obtained in a prospective study, with the results showing that AI assistance helps experienced endoscopists identify adenomatous lesions that are otherwise difficult to detect[73].

CADe systems do not address exposure errors, but new computer-aided quality-assessment systems are being investigated. An AI-based system for measuring fold evaluation quality (FEQ) demonstrated a significant correlation with expert scores, historical ADR, and withdrawal time, and for colonoscopies performed by colonoscopists with previously low ADRs (< 25%), AI assistance significantly improved the FEQ[74]. AI assistance also reduces the incidence of missed adenomas during colonoscopy, as nearly 80% of the lesions overlooked by endoscopists are detected by AI. AI serves as a complementary "eye" for the endoscopist, unaffected by distraction and fatigue. However, approximately 20% of missed adenomas are not even visible on the screen, often located behind folds, in challenging flexure positions, or concealed under fecal content in cases of poor bowel preparation[75].

Nodal metastases are invariably a key determinant of DFS and OS in patients with CRC[76]. Nodal status has a substantive impact on the staging process, consequently dictating the implementation of appropriate adjuvant and neoadjuvant therapies. Staging is based upon contrast-enhanced CT imaging in CRC cases, with additional MR imaging in patients with rectal cancer. Staging accuracy is contingent on a multitude of factors, namely the reporting radiologist's expertise, equipment performance, employed imaging protocols, and patient-related factors. Current evidence suggests that CT and MR imaging have a diagnostic accuracy for nodal metastasis detection of 70% and 69%, respectively, with the application of standard criteria[77,78]. Pre-operative MRI-based staging has mainstreamed neoadjuvant chemoradiation into locally advanced rectal cancer (LARC) management, afforded superior blueprints for resectability, and enhanced locoregional control. Neoadjuvant therapy, however, is not commonplace in colon cancer management due to subpar nodal staging accuracy in this cohort[79]. Notably, the results of the FOXTROT trial demonstrating robust safety and efficacy estimates for neoadjuvant therapy could potentially alter the current paradigm[80]. Augmentation of nodal staging modalities is paramount to bring about this change, and AI is a powerful tool that can overcome the shortcomings of contemporary diagnostics.

Ito et al[81] demonstrated that CNN-based diagnosis of cT1b CRC yielded a sensitivity, specificity, accuracy, and AUC of 67.5%, 89.0%, 81.2%, and 0.871, respectively. Another study showed that AI-based determination of the need for surgery after endoscopic resection of T1 CRC is more accurate than the current guidelines, thereby preventing unwarranted surgeries in this select group of patients. The model employed in this study, which took into account 45 clinicopathologic factors, yielded a sensitivity, specificity, and accuracy of 100%, 66%, and 69%, respectively[82].

MRI is the gold standard tool for detecting CRC LNM before surgery. Of note, studies utilizing faster region-based CNNs have outperformed senior radiologists in metastatic LN detection on MR imaging[83,84]. Lu et al[83] showed that their CNN-based model (AUC 0.91) reached a diagnosis in 20 s, compared with 600 s for radiologists. In another study, a LASSO-based ML algorithm incorporating histopathologic variables and tumor-infiltrating leukocytes outperformed conventional criteria in predicting LNM in T1 CRC cases[85]. An AI-based model built on 152 CT-based tumor variables and 6 clinical variables to detect CRC hepatic metastases achieved accuracies of 90.63% and 85.5% in the training and validation sets, with AUCs of 0.96 and 0.87, respectively[86]. Additionally, highly accurate and expeditious DL models have recently been developed for CRC localization and segmentation on MR imaging; these need to be iterated and validated further in randomized controlled studies[87,88].
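As an illustration of the LASSO-style approach — not the published algorithm[85] — the sketch below fits an L1-penalized logistic regression by proximal gradient descent on synthetic data standing in for histopathologic variables, showing how the penalty selects a sparse subset of predictors; all features, labels, and hyperparameters are invented.

```python
import numpy as np

rng = np.random.default_rng(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

# Synthetic stand-in for 10 candidate predictors (e.g. histopathologic
# variables, tumor-infiltrating leukocyte counts); only the first three
# actually drive the binary LNM label.
X = rng.normal(size=(300, 10))
true_w = np.array([1.5, -1.0, 0.8, 0, 0, 0, 0, 0, 0, 0])
y = (X @ true_w + 0.5 * rng.normal(size=300) > 0).astype(float)

def lasso_logistic(X, y, lam=0.02, lr=0.1, iters=2000):
    """L1-penalized logistic regression via proximal gradient descent:
    a gradient step on the logistic loss followed by soft-thresholding."""
    w = np.zeros(X.shape[1])
    for _ in range(iters):
        w -= lr * X.T @ (sigmoid(X @ w) - y) / len(y)   # loss gradient step
        w = np.sign(w) * np.maximum(np.abs(w) - lr * lam, 0.0)  # L1 shrinkage
    return w

w = lasso_logistic(X, y)
selected = np.flatnonzero(np.abs(w) > 1e-6)   # predictors kept by the penalty
```

The soft-thresholding step is what drives uninformative coefficients toward exactly zero, which is why LASSO-based models double as variable-selection tools in clinicopathologic datasets.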

AI-based radiomics nomograms and DL algorithms have remarkably impacted the accuracy of radiologic diagnosis and staging. Radiomics models analyze a vast number of investigator-determined characteristics from large databases using advanced computational tools. AI models have attained only partial success in the detection of nodal metastases thus far, but several studies have concluded that DL algorithms can successfully detect anomalies that elude conventional staging diagnostics[89,90]. Further research yielding advanced iterations of these AI models could prove instrumental in obtaining robust diagnostic and staging estimates.


AI can integrate and analyze vast amounts of data generated during colon cancer diagnosis and treatment, and provide real-time decision support to clinicians. Newer technologies in colorectal surgery are using mathematical algorithms, 3D imaging, and AI to improve patient outcomes. 3D image processing and reconstruction are being used in colorectal surgical oncology, specifically focusing on complete mesocolic excision and D3-lymphadenectomy in colon cancer cases. The system effectively delineates the margins around the tumor, including the vascularization, and provides critical information for surgical planning[91]. Several AI models have been evaluated in the context of laparoscopic colorectal resections. Kitaguchi et al[92] constructed a multicenter dataset of 300 laparoscopic colorectal surgery videos and were able to recognize surgical phase, action, and tool with high accuracy. Overall accuracies for action classification and surgical phase recognition were 83.2% and 81%, respectively, with reasonable tool segmentation performance, reflected by an intersection over union (IoU) of 51.2%. Recognition accuracy was 81% and 87% for the transection and anastomosis phases, respectively, and semantic segmentation of surgical instruments yielded mean IoUs ranging from 33.6% to 68.9%. Kitaguchi et al[93] developed another AI model, using 50 transanal total mesorectal excision videos, that yielded an overall accuracy of 93.2% for phase recognition and 76.7% for combined phase and step recognition. Supervised ML models are also being used during robotic rectal resection procedures: in a recent study, ML-based intraoperative guidance during total mesorectal excision (TME) yielded higher mean F1 scores for Gerota's fascia (0.78) and the mesocolon (0.71), but lower scores for the dissection plane (0.32) and the exact dissection line (0.05)[94]. Igaki et al[95] developed a DL model using 600 intraoperative TME images that could accurately delineate the oncologic dissection plane, with a Dice coefficient of 0.84.
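The IoU and Dice values quoted for these segmentation models are simple overlap ratios between predicted and reference masks. A minimal, self-contained illustration on toy 4 × 4 binary masks (the masks are invented, standing in for, e.g., an instrument or dissection-plane segmentation on a surgical video frame):

```python
def iou(pred, target):
    """Intersection over union of two binary masks (nested 0/1 lists)."""
    inter = sum(p & t for rp, rt in zip(pred, target) for p, t in zip(rp, rt))
    union = sum(p | t for rp, rt in zip(pred, target) for p, t in zip(rp, rt))
    return inter / union

def dice(pred, target):
    """Dice coefficient: 2|A ∩ B| / (|A| + |B|)."""
    inter = sum(p & t for rp, rt in zip(pred, target) for p, t in zip(rp, rt))
    size = sum(v for row in pred for v in row) + \
           sum(v for row in target for v in row)
    return 2 * inter / size

# Toy predicted vs. reference masks: they agree on 4 pixels,
# the reference contains 2 extra pixels the prediction missed.
pred   = [[1, 1, 0, 0], [1, 1, 0, 0], [0, 0, 0, 0], [0, 0, 0, 0]]
target = [[1, 1, 1, 0], [1, 1, 1, 0], [0, 0, 0, 0], [0, 0, 0, 0]]
# iou(pred, target)  -> 4/6 ≈ 0.667
# dice(pred, target) -> 0.8
```

The two metrics are monotonically related (Dice = 2·IoU/(1 + IoU)), so Dice is always the more flattering number; this is worth remembering when comparing the 51.2% IoU of Kitaguchi et al[92] with the 0.84 Dice of Igaki et al[95].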

Robotic surgery, novel computer-assisted drug delivery techniques, and ML models can potentially revolutionize CRC treatment and personalize cancer care. AI models have been developed to predict the half-maximal inhibitory concentration of anti-cancer drugs against the human colon carcinoma HCT116 cell line with an overall prediction accuracy of over 63%[96]. DNNs have also been used to develop anticancer drugs that inhibit PI3K alpha and tankyrase, with promising results[97]. Nanoparticles are being explored as pharmaceutical carriers for tumor targeting[98]. Computer-aided magnetotactic displacement techniques have been proposed to navigate and deliver drug-loaded magnetotactic bacteria MC-1 toward the hypoxic areas of tumors[99]. MRI-based AI models have also been developed to identify pathological complete responders and non-responders among patients with LARC after neoadjuvant chemoradiotherapy, with AUCs of 0.86 and 0.83, respectively[100].

ML can extract disease severity prediction models from electronic medical records, addressing issues of timeliness, imprecision, and data integrity. ML can be used to develop multi-method integrated models, trained on large data inputs, that prognosticate CRC patients by analyzing genetic and environmental variation between CRC patients and cancer-free controls[101]. Reliable multistage survival prediction models have also been developed for the prognostication of patients with advanced CRC[102]. Semi-supervised learning methods use both labeled and unlabeled data, together with graph regularization, to predict patient survival and cancer recurrence. Recent literature has shown that semi-supervised learning models can improve the generalizability of CRC risk prediction models. A recent study employed a semi-supervised logistic regression model to establish a clinical prediction model of CRC survival risk that reportedly showed good calibration, generalizability, interpretability, and clinical practicability when strictly compared with existing supervised learning models[103]. Bychkov et al[104] trained a deep network combining convolutional and recurrent architectures to predict outcomes based on tissue morphology. The model could accurately prognosticate patients using small tissue areas as input and, interestingly, outperformed human experts at both the tissue-spot and whole-slide levels across high- and low-risk patient strata.
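As a deliberately simplified sketch of the semi-supervised idea (self-training rather than the graph-regularized method of the cited study), the code below fits a logistic model on a small labeled subset, adopts confident predictions on the unlabeled pool as pseudo-labels, and refits. The two-cluster data, the 0.9/0.1 confidence thresholds, and all other details are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

def fit_logistic(X, y, lr=0.5, iters=500):
    """Plain (intercept-free) logistic regression by gradient descent."""
    w = np.zeros(X.shape[1])
    for _ in range(iters):
        w -= lr * X.T @ (sigmoid(X @ w) - y) / len(y)
    return w

# Two Gaussian clusters standing in for low- and high-risk patients;
# their symmetry about the origin makes an intercept unnecessary.
X = np.vstack([rng.normal(-1.0, 1.0, (100, 2)),
               rng.normal(1.0, 1.0, (100, 2))])
y = np.array([0] * 100 + [1] * 100)

labeled = rng.choice(200, size=20, replace=False)     # only 20 labels known
unlabeled = np.setdiff1d(np.arange(200), labeled)

# One round of self-training: fit on the labeled subset, pseudo-label the
# unlabeled pool where the model is confident, then refit on the union.
w = fit_logistic(X[labeled], y[labeled])
p = sigmoid(X[unlabeled] @ w)
confident = (p > 0.9) | (p < 0.1)
X_aug = np.vstack([X[labeled], X[unlabeled][confident]])
y_aug = np.concatenate([y[labeled], (p[confident] > 0.5).astype(int)])
w = fit_logistic(X_aug, y_aug)

accuracy = np.mean((sigmoid(X @ w) > 0.5) == y)
```

The appeal in the clinical setting is exactly the situation sketched here: survival labels are expensive (they require follow-up), while unlabeled patient records are plentiful.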

Chemotherapy plays a crucial role in influencing the prognosis of patients with both resectable and unresectable colorectal liver metastases (CRLM). AI models can predict chemotherapy response and early local tumor progression after ablative therapies in CRLM patients. Maaref et al[105] developed a DL-based CNN that achieved an accuracy of 0.91 (95%CI: 0.88-0.93) for distinguishing treated from untreated lesions and 0.78 (95%CI: 0.74-0.83) for predicting response to a chemotherapeutic regimen (AUC 0.82), with 92% sensitivity and 86% specificity for predicting therapeutic response in patients with HER2 amplification (AUC 0.78). Wei et al[106] used CECT imaging to train a DL radiomics model that accurately predicted CRLM response to different chemotherapeutic regimens, with an AUC of 0.82 in the validation cohort; the predictive power increased to 0.93 when the DL model was used in conjunction with the serum CEA level. Taghavi et al[107] developed predictive models, using radiomics and clinicopathological characteristics, that were highly predictive of CRLM development within the next 2 years, with an AUC of 0.86 in the validation cohorts. These models have the potential to enhance oncological outcomes in CRLM patients, but further research is needed to elucidate and integrate these models into clinical practice[108].
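The kind of AUC gain reported when a DL score is combined with serum CEA can be illustrated with a toy computation. The per-patient data, the 0.01 weighting of CEA, and the fusion rule below are all hypothetical; only the Mann-Whitney formulation of AUC is standard.

```python
def auc(labels, scores):
    """ROC AUC via the Mann-Whitney formulation: the probability that a
    randomly chosen positive case outranks a randomly chosen negative one."""
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical per-patient data: DL radiomics score, serum CEA (ng/mL),
# and chemotherapy-response label (1 = responder).
labels   = [1, 1, 1, 0, 0, 0, 1, 0]
dl_score = [0.9, 0.7, 0.4, 0.5, 0.3, 0.2, 0.6, 0.8]
cea      = [3.0, 4.0, 6.0, 30.0, 25.0, 40.0, 5.0, 8.0]

auc_dl = auc(labels, dl_score)                       # 0.75 on this toy set
# Naive late fusion: penalize the DL score by a (hypothetical) CEA term,
# since higher CEA argues against response.
combined = [d - 0.01 * c for d, c in zip(dl_score, cea)]
auc_combined = auc(labels, combined)                 # rises to 0.8125
```

The fused score improves discrimination because the biomarker corrects cases the imaging score misranks — the same intuition behind the reported jump from 0.82 to 0.93 when CEA was added.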


AI-driven decision support systems can present relevant patient information, suggest appropriate treatment plans, and offer insights into potential clinical trial options. CRC risk prediction models can be useful in the identification of high-risk individuals who can benefit from closer cancer surveillance[68]. Routine use of CADe systems can significantly increase the colonoscopic yield of neoplastic lesions[69]. Augmentation of pre-operative MR imaging with ML algorithms to detect LNM can help identify candidates for neoadjuvant therapy in the LARC cohort[83,84]. AI models can also help with surgical planning and guide surgeons intra-operatively by delineating the best oncologic resection plane(s)[92-94]. AI-based risk prediction models can yield accurate estimates for survival, cancer recurrence, and the likelihood of response to therapy[102,105-107].


Detection of nodal metastases using AI-based models has achieved only partial success so far, and the available literature on AI-based CRC management is sparse. The robustness of these AI algorithms depends on the amount and quality of input data. Unfortunately, several existing AI models are based on retrospective analyses with strict inclusion and exclusion criteria, thereby limiting widespread clinical application. Further well-powered prospective studies are needed to yield advanced iterations of these AI models and test their performance in the real world.


The way forward lies in the development of diagnostic and prognostic tools in which AI interfaces with humans to enable accurate pathologic, endoscopic, and radiologic tumor detection and assessment. An integrative approach harnessing the computational efficacy of these novel algorithms for image-based diagnosis and prognostication, combined with existing clinical expertise, could synergistically prove instrumental in reducing miss rates of neoplastic lesions. AI can be pivotal in ushering in an era of personalized and targeted therapies based on specific tumor heterogeneity. However, well-powered randomized trials are needed to determine the diagnostic and prognostic potential of these algorithms. Challenges such as interpretability, data scarcity, and generalization remain: AI models are often complex and lack transparency, making it difficult to understand how their decisions are made. These shortcomings can be addressed by training and validating AI models on large amounts of high-quality data. The gap between AI model development and clinical implementation needs to be closed. Successful integration of AI into clinical practice necessitates rigorous validation, collaboration between AI developers and medical experts, and adherence to ethical standards.


Provenance and peer review: Invited article; Externally peer reviewed.

Peer-review model: Single blind

Specialty type: Oncology

Country of origin: United States

Peer-review report’s scientific quality classification

Grade A (Excellent): 0

Grade B (Very good): 0

Grade C (Good): 0

Grade D (Fair): D

Grade E (Poor): 0

P-Reviewer: Liu S, China; S-Editor: Liu JH; L-Editor: A; P-Editor: Cai YX

1.  Wu J, Chen J, Cai J. Application of Artificial Intelligence in Gastrointestinal Endoscopy. J Clin Gastroenterol. 2021;55:110-120.  [PubMed]  [DOI]  [Cited in This Article: ]
2.  Noorbakhsh-Sabet N, Zand R, Zhang Y, Abedi V. Artificial Intelligence Transforms the Future of Health Care. Am J Med. 2019;132:795-801.  [PubMed]  [DOI]  [Cited in This Article: ]
3.  Bray F, Ferlay J, Soerjomataram I, Siegel RL, Torre LA, Jemal A. Global cancer statistics 2018: GLOBOCAN estimates of incidence and mortality worldwide for 36 cancers in 185 countries. CA Cancer J Clin. 2018;68:394-424.  [PubMed]  [DOI]  [Cited in This Article: ]
4.  Le Berre C, Sandborn WJ, Aridhi S, Devignes MD, Fournier L, Smaïl-Tabbone M, Danese S, Peyrin-Biroulet L. Application of Artificial Intelligence to Gastroenterology and Hepatology. Gastroenterology. 2020;158:76-94.e2.  [PubMed]  [DOI]  [Cited in This Article: ]
5.  Sheikh M, Roshandel G, McCormack V, Malekzadeh R. Current Status and Future Prospects for Esophageal Cancer. Cancers (Basel). 2023;15.  [PubMed]  [DOI]  [Cited in This Article: ]
6.  Zhang YH, Guo LJ, Yuan XL, Hu B. Artificial intelligence-assisted esophageal cancer management: Now and future. World J Gastroenterol. 2020;26:5256-5271.  [PubMed]  [DOI]  [Cited in This Article: ]
7.  Merchán Gómez B, Milla Collado L, Rodríguez M. Artificial intelligence in esophageal cancer diagnosis and treatment: where are we now?-a narrative review. Ann Transl Med. 2023;11:353.  [PubMed]  [DOI]  [Cited in This Article: ]
8.  Guidozzi N, Menon N, Chidambaram S, Markar SR. The role of artificial intelligence in the endoscopic diagnosis of esophageal cancer: a systematic review and meta-analysis. Dis Esophagus. 2023;36.  [PubMed]  [DOI]  [Cited in This Article: ]
9.  de Groof AJ, Struyvenberg MR, van der Putten J, van der Sommen F, Fockens KN, Curvers WL, Zinger S, Pouw RE, Coron E, Baldaque-Silva F, Pech O, Weusten B, Meining A, Neuhaus H, Bisschops R, Dent J, Schoon EJ, de With PH, Bergman JJ. Deep-Learning System Detects Neoplasia in Patients With Barrett's Esophagus With Higher Accuracy Than Endoscopists in a Multistep Training and Validation Study With Benchmarking. Gastroenterology. 2020;158:915-929.e4.  [PubMed]  [DOI]  [Cited in This Article: ]
10.  Islam MM, Poly TN, Walther BA, Yeh CY, Seyed-Abdul S, Li YJ, Lin MC. Deep Learning for the Diagnosis of Esophageal Cancer in Endoscopic Images: A Systematic Review and Meta-Analysis. Cancers (Basel). 2022;14.  [PubMed]  [DOI]  [Cited in This Article: ]
11.  Luo H, Xu G, Li C, He L, Luo L, Wang Z, Jing B, Deng Y, Jin Y, Li Y, Li B, Tan W, He C, Seeruttun SR, Wu Q, Huang J, Huang DW, Chen B, Lin SB, Chen QM, Yuan CM, Chen HX, Pu HY, Zhou F, He Y, Xu RH. Real-time artificial intelligence for detection of upper gastrointestinal cancer by endoscopy: a multicentre, case-control, diagnostic study. Lancet Oncol. 2019;20:1645-1654.  [PubMed]  [DOI]  [Cited in This Article: ]
12.  Cai SL, Li B, Tan WM, Niu XJ, Yu HH, Yao LQ, Zhou PH, Yan B, Zhong YS. Using a deep learning system in endoscopy for screening of early esophageal squamous cell carcinoma (with video). Gastrointest Endosc. 2019;90:745-753.e2.  [PubMed]  [DOI]  [Cited in This Article: ]
13.  Hashimoto R, Requa J, Dao T, Ninh A, Tran E, Mai D, Lugo M, El-Hage Chehade N, Chang KJ, Karnes WE, Samarasena JB. Artificial intelligence using convolutional neural networks for real-time detection of early esophageal neoplasia in Barrett's esophagus (with video). Gastrointest Endosc. 2020;91:1264-1271.e1.  [PubMed]  [DOI]  [Cited in This Article: ]
14.  Horie Y, Yoshio T, Aoyama K, Yoshimizu S, Horiuchi Y, Ishiyama A, Hirasawa T, Tsuchida T, Ozawa T, Ishihara S, Kumagai Y, Fujishiro M, Maetani I, Fujisaki J, Tada T. Diagnostic outcomes of esophageal cancer by artificial intelligence using convolutional neural networks. Gastrointest Endosc. 2019;89:25-32.  [PubMed]  [DOI]  [Cited in This Article: ]
15.  Wang YK, Syu HY, Chen YH, Chung CS, Tseng YS, Ho SY, Huang CW, Wu IC, Wang HC. Endoscopic Images by a Single-Shot Multibox Detector for the Identification of Early Cancerous Lesions in the Esophagus: A Pilot Study. Cancers (Basel). 2021;13.  [PubMed]  [DOI]  [Cited in This Article: ]
16.  Takeuchi M, Seto T, Hashimoto M, Ichihara N, Morimoto Y, Kawakubo H, Suzuki T, Jinzaki M, Kitagawa Y, Miyata H, Sakakibara Y. Performance of a deep learning-based identification system for esophageal cancer from CT images. Esophagus. 2021;18:612-620.  [PubMed]  [DOI]  [Cited in This Article: ]
17.  Zhao YY, Xue DX, Wang YL, Zhang R, Sun B, Cai YP, Feng H, Cai Y, Xu JM. Computer-assisted diagnosis of early esophageal squamous cell carcinoma using narrow-band imaging magnifying endoscopy. Endoscopy. 2019;51:333-341.  [PubMed]  [DOI]  [Cited in This Article: ]
18.  van der Putten J, Struyvenberg M, de Groof J, Scheeve T, Curvers W, Schoon E, Bergman JJGHM, de With PHN, van der Sommen F. Deep principal dimension encoding for the classification of early neoplasia in Barrett's Esophagus with volumetric laser endomicroscopy. Comput Med Imaging Graph. 2020;80:101701.  [PubMed]  [DOI]  [Cited in This Article: ]
19.  Pan Y, He L, Chen W, Yang Y. The current state of artificial intelligence in endoscopic diagnosis of early esophageal squamous cell carcinoma. Front Oncol. 2023;13:1198941.  [PubMed]  [DOI]  [Cited in This Article: ]
20.  Ebigbo A, Mendel R, Rückert T, Schuster L, Probst A, Manzeneder J, Prinz F, Mende M, Steinbrück I, Faiss S, Rauber D, de Souza LA Jr, Papa JP, Deprez PH, Oyama T, Takahashi A, Seewald S, Sharma P, Byrne MF, Palm C, Messmann H. Endoscopic prediction of submucosal invasion in Barrett's cancer with the use of artificial intelligence: a pilot study. Endoscopy. 2021;53:878-883.  [PubMed]  [DOI]  [Cited in This Article: ]
21.  Yang XX, Li Z, Shao XJ, Ji R, Qu JY, Zheng MQ, Sun YN, Zhou RC, You H, Li LX, Feng J, Yang XY, Li YQ, Zuo XL. Real-time artificial intelligence for endoscopic diagnosis of early esophageal squamous cell cancer (with video). Dig Endosc. 2021;33:1075-1084.  [PubMed]  [DOI]  [Cited in This Article: ]
22.  Zhang SM, Wang YJ, Zhang ST. Accuracy of artificial intelligence-assisted detection of esophageal cancer and neoplasms on endoscopic images: A systematic review and meta-analysis. J Dig Dis. 2021;22:318-328.  [PubMed]  [DOI]  [Cited in This Article: ]
23.  Akashi T, Okumura T, Terabayashi K, Yoshino Y, Tanaka H, Yamazaki T, Numata Y, Fukuda T, Manabe T, Baba H, Miwa T, Watanabe T, Hirano K, Igarashi T, Sekine S, Hashimoto I, Shibuya K, Hojo S, Yoshioka I, Matsui K, Yamada A, Sasaki T, Fujii T. The use of an artificial intelligence algorithm for circulating tumor cell detection in patients with esophageal cancer. Oncol Lett. 2023;26:320.  [PubMed]  [DOI]  [Cited in This Article: ]
24.  Zhang ST, Wang SY, Zhang J, Dong D, Mu W, Xia XE, Fu FF, Lu YN, Wang S, Tang ZC, Li P, Qu JR, Wang MY, Tian J, Liu JH. Artificial intelligence-based computer-aided diagnosis system supports diagnosis of lymph node metastasis in esophageal squamous cell carcinoma: A multicenter study. Heliyon. 2023;9:e14030.  [PubMed]  [DOI]  [Cited in This Article: ]
25.  Tan X, Ma Z, Yan L, Ye W, Liu Z, Liang C. Radiomics nomogram outperforms size criteria in discriminating lymph node metastasis in resectable esophageal squamous cell carcinoma. Eur Radiol. 2019;29:392-400.  [PubMed]  [DOI]  [Cited in This Article: ]
26.  Wu L, Yang X, Cao W, Zhao K, Li W, Ye W, Chen X, Zhou Z, Liu Z, Liang C. Multiple Level CT Radiomics Features Preoperatively Predict Lymph Node Metastasis in Esophageal Cancer: A Multicentre Retrospective Study. Front Oncol. 2019;9:1548.  [PubMed]  [DOI]  [Cited in This Article: ]
27.  Hosseini F, Asadi F, Emami H, Ebnali M. Machine learning applications for early detection of esophageal cancer: a systematic review. BMC Med Inform Decis Mak. 2023;23:124.  [PubMed]  [DOI]  [Cited in This Article: ]
28.  Li J, Li L, You P, Wei Y, Xu B. Towards artificial intelligence to multi-omics characterization of tumor heterogeneity in esophageal cancer. Semin Cancer Biol. 2023;91:35-49.  [PubMed]  [DOI]  [Cited in This Article: ]
29.  Siddique S, Chow JCL. Artificial intelligence in radiotherapy. Rep Pract Oncol Radiother. 2020;25:656-666.  [PubMed]  [DOI]  [Cited in This Article: ]
30.  Li X, Gao H, Zhu J, Huang Y, Zhu Y, Huang W, Li Z, Sun K, Liu Z, Tian J, Li B. 3D Deep Learning Model for the Pretreatment Evaluation of Treatment Response in Esophageal Carcinoma: A Prospective Study (ChiCTR2000039279). Int J Radiat Oncol Biol Phys. 2021;111:926-935.  [PubMed]  [DOI]  [Cited in This Article: ]
31.  Ypsilantis PP, Siddique M, Sohn HM, Davies A, Cook G, Goh V, Montana G. Predicting Response to Neoadjuvant Chemotherapy with PET Imaging Using Convolutional Neural Networks. PLoS One. 2015;10:e0137036.  [PubMed]  [DOI]  [Cited in This Article: ]
32.  Warnecke-Eberz U, Metzger R, Bollschweiler E, Baldus SE, Mueller RP, Dienes HP, Hoelscher AH, Schneider PM. TaqMan low-density arrays and analysis by artificial neuronal networks predict response to neoadjuvant chemoradiation in esophageal cancer. Pharmacogenomics. 2010;11:55-64.  [PubMed]  [DOI]  [Cited in This Article: ]
33.  Xu J, Zhou J, Hu J, Ren Q, Wang X, Shu Y. Development and validation of a machine learning model for survival risk stratification after esophageal cancer surgery. Front Oncol. 2022;12:1068198.  [PubMed]  [DOI]  [Cited in This Article: ]
34.  Wang S, Chen X, Fan J, Lu L. Prognostic Significance of Lymphovascular Invasion for Thoracic Esophageal Squamous Cell Carcinoma. Ann Surg Oncol. 2016;23:4101-4109.  [PubMed]  [DOI]  [Cited in This Article: ]
35.  Xu G, Feng F, Liu Z, Liu S, Zheng G, Xiao S, Cai L, Yang X, Li G, Lian X, Guo M, Sun L, Yang J, Fan D, Lu Q, Zhang H. Prognosis and Progression of ESCC Patients with Perineural Invasion. Sci Rep. 2017;7:43828.  [PubMed]  [DOI]  [Cited in This Article: ]
36.  Yeh JC, Yu WH, Yang CK, Chien LI, Lin KH, Huang WS, Hsu PK. Predicting aggressive histopathological features in esophageal cancer with positron emission tomography using a deep convolutional neural network. Ann Transl Med. 2021;9:37.  [PubMed]  [DOI]  [Cited in This Article: ]
37.  Yang CK, Yeh JC, Yu WH, Chien LI, Lin KH, Huang WS, Hsu PK. Deep Convolutional Neural Network-Based Positron Emission Tomography Analysis Predicts Esophageal Cancer Outcome. J Clin Med. 2019;8.  [PubMed]  [DOI]  [Cited in This Article: ]
38.  Cao R, Tang L, Fang M, Zhong L, Wang S, Gong L, Li J, Dong D, Tian J. Artificial intelligence in gastric cancer: applications and challenges. Gastroenterol Rep (Oxf). 2022;10:goac064.  [PubMed]  [DOI]  [Cited in This Article: ]
39.  Smyth EC, Nilsson M, Grabsch HI, van Grieken NC, Lordick F. Gastric cancer. Lancet. 2020;396:635-648.  [PubMed]  [DOI]  [Cited in This Article: ]
40.  Necula L, Matei L, Dragu D, Neagu AI, Mambet C, Nedeianu S, Bleotu C, Diaconu CC, Chivu-Economescu M. Recent advances in gastric cancer early diagnosis. World J Gastroenterol. 2019;25:2029-2044.  [PubMed]  [DOI]  [Cited in This Article: ]
41.  Guimarães P, Keller A, Fehlmann T, Lammert F, Casper M. Deep-learning based detection of gastric precancerous conditions. Gut. 2020;69:4-6.  [PubMed]  [DOI]  [Cited in This Article: ]
42.  Zhang Y, Li F, Yuan F, Zhang K, Huo L, Dong Z, Lang Y, Zhang Y, Wang M, Gao Z, Qin Z, Shen L. Diagnosing chronic atrophic gastritis by gastroscopy using artificial intelligence. Dig Liver Dis. 2020;52:566-572.  [PubMed]  [DOI]  [Cited in This Article: ]
43.  Yan T, Wong PK, Choi IC, Vong CM, Yu HH. Intelligent diagnosis of gastric intestinal metaplasia based on convolutional neural network and limited number of endoscopic images. Comput Biol Med. 2020;126:104026.  [PubMed]  [DOI]  [Cited in This Article: ]
44.  Xiao Z, Ji D, Li F, Li Z, Bao Z. Application of Artificial Intelligence in Early Gastric Cancer Diagnosis. Digestion. 2022;103:69-75.  [PubMed]  [DOI]  [Cited in This Article: ]
45.  Yuan XL, Zhou Y, Liu W, Luo Q, Zeng XH, Yi Z, Hu B. Artificial intelligence for diagnosing gastric lesions under white-light endoscopy. Surg Endosc. 2022;36:9444-9453.  [PubMed]  [DOI]  [Cited in This Article: ]