Published online Apr 28, 2021. doi: 10.35713/aic.v2.i2.12
Peer-review started: March 9, 2021
First decision: March 26, 2021
Revised: April 2, 2021
Accepted: April 20, 2021
Article in press: April 20, 2021
Published online: April 28, 2021
Processing time: 48 Days and 4.6 Hours
Early diagnosis and timely treatment are crucial in reducing cancer-related mortality. Artificial intelligence (AI) has greatly relieved clinical workloads and changed current medical workflows. We searched for recent studies, reports and reviews on AI and solid tumors; many reviews have summarized AI applications in the diagnosis and treatment of a single tumor type. Here we systematically review advances in the application of AI to multiple solid tumors, including cancers of the esophagus, stomach, intestine, breast, thyroid, prostate, lung, liver, cervix, pancreas and kidney, with a specific focus on the continual improvement in model performance in imaging practice.
Core Tip: Many reviews have summarized artificial intelligence applications in the diagnosis and treatment of a single tumor type. However, this is the first review to systematically examine how artificial intelligence relieves clinical workloads and changes current medical workflows while maintaining the high quality needed for precision medicine across multiple solid tumors. Given its clear advantages in imaging practice, patients will benefit from earlier diagnosis and more appropriate treatment.
- Citation: Shao Y, Zhang YX, Chen HH, Lu SS, Zhang SC, Zhang JX. Advances in the application of artificial intelligence in solid tumor imaging. Artif Intell Cancer 2021; 2(2): 12-24
- URL: https://www.wjgnet.com/2644-3228/full/v2/i2/12.htm
- DOI: https://dx.doi.org/10.35713/aic.v2.i2.12
Cancer is currently a worldwide health problem. Early diagnosis and timely treatment are crucial in reducing cancer-related mortality. Medical imaging is a common technique used to guide the clinical diagnosis of solid tumors. Accurate interpretation of imaging data has become an important but difficult task in the diagnosis process.
Artificial intelligence (AI) refers to an information science that researches and develops theories, methods, technologies and application systems used to simulate, expand and extend human intelligence[1]. With the rapid development of machine learning, deep learning and other crucial AI technologies in the field of image processing in recent years, these approaches have made great contributions to disease classification, prognosis prediction and therapy evaluation and can identify patterns that humans cannot recognize[2-4] (Figure 1). Here, we review the advantages of AI applications in imaging examinations of multiple solid tumors and highlight their great benefits in optimizing the clinical work process, providing accurate tumor assessment for current precision medicine and achieving better diagnosis and treatment results, based on practical data and literature reports.
Gastric cancer is one of the most common gastrointestinal malignancies at present, with a poor prognosis and high mortality. Endoscopy and pathological biopsy are still the “gold standard” for the diagnosis of gastric cancer, but they have shortcomings[5]. For example, the sensitivity of endoscopic diagnosis of atrophic gastritis is only 42%, so the rate of missed diagnosis is relatively high[6]. Multipoint biopsy sampling increases the risk of tissue injury and gastrorrhagia[7,8]. Some advanced endoscopic techniques, such as chromoendoscopy combined with magnifying endoscopy and confocal laser endomicroscopy, can provide only images of the mucosal surface of the gastrointestinal tract[7-9]. Billah et al[10] used video endoscopy along with a convolutional neural network (CNN) and color wavelet features to automatically detect gastrointestinal polyps.
Narrow-band imaging (NBI) is an emerging, advanced and noninvasive endoscopic technology that can strengthen the evaluation of the surface structure and microvasculature of the mucosa.
The CNN-chronic atrophic gastritis approach developed by Zhang et al[7] has good classification performance for recognizing chronic atrophic gastritis from gastric antrum images, with an area under the curve (AUC) close to 0.99. The accuracy, sensitivity and specificity of CNN-chronic atrophic gastritis for the diagnosis of atrophic gastritis were all above 0.94. In this study, 1458 mild, 1348 moderate and 38 severe cases of atrophic gastritis were tested with the CNN model, and the accuracy rates were 0.93, 0.95 and 0.99, respectively, indicating good consistency of the CNN model with the clinical diagnosis of atrophic gastritis.
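For readers less familiar with these metrics, the following sketch (not the authors' code; the label and probability arrays are hypothetical) illustrates how accuracy, sensitivity, specificity and AUC are typically derived from a binary classifier's outputs with scikit-learn.

```python
import numpy as np
from sklearn.metrics import confusion_matrix, roc_auc_score

# Hypothetical ground-truth labels (1 = atrophic gastritis) and predicted probabilities.
y_true = np.array([1, 0, 1, 1, 0, 0, 1, 0])
y_prob = np.array([0.92, 0.08, 0.85, 0.64, 0.30, 0.12, 0.77, 0.41])
y_pred = (y_prob >= 0.5).astype(int)  # threshold the probabilities at 0.5

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()

accuracy = (tp + tn) / (tp + tn + fp + fn)
sensitivity = tp / (tp + fn)         # true positive rate (recall)
specificity = tn / (tn + fp)         # true negative rate
auc = roc_auc_score(y_true, y_prob)  # area under the ROC curve

print(f"accuracy={accuracy:.2f}, sensitivity={sensitivity:.2f}, "
      f"specificity={specificity:.2f}, AUC={auc:.2f}")
```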
However, it has been reported that AI technology used for gastric cancer or esophagogastric junctional adenocarcinoma is susceptible to problems related to tumor morphology, atrophic change, an uneven mucosal background, etc., which leads to low specificity and a high false positive rate (FPR)[17]. Several studies have indicated that the application of AI in the clinic has high accuracy. When AI technology is combined with the work of endoscopists, it can help them better diagnose atrophic gastritis, increase the rate of early gastric cancer diagnosis and avoid unnecessary pathological biopsies[18,19].
Regarding small early gastric tumors, Abe et al[18] showed that AI technology can find anomalies faster than endoscopists (45.5 s vs 173.0 min), and it also shows higher sensitivity (58.4% vs 31.9%). However, the positive predictive value (PPV) and specificity of AI technology were relatively lower than those of endoscopists (26.0% vs 46.2% and 87.3% vs 97.0%, respectively)[18]. A computer-aided diagnosis (CAD) system applied to still images of magnifying endoscopy combined with NBI has an accuracy rate for early gastric cancer diagnosis of 85.3%[20]. When endoscopy cannot identify and capture images of lesions, magnifying endoscopy combined with NBI video in the CAD system can support the real-time clinical diagnosis of early gastric cancer. Horiuchi et al[19] reported that the diagnostic performance of the CAD system using magnifying endoscopy combined with NBI video was equal to or better than that of 11 experienced endoscopic experts for early gastric cancer. The AUC was 0.8684, and the accuracy, sensitivity, specificity, PPV and negative predictive value (NPV) were 85.1%, 87.4%, 82.8%, 83.5% and 86.7%, respectively[19].
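For reference, the PPV and NPV quoted above, like sensitivity and specificity, follow the standard confusion-matrix definitions, where TP, FP, TN and FN denote true positives, false positives, true negatives and false negatives:

$$\text{Sensitivity} = \frac{TP}{TP+FN},\qquad \text{Specificity} = \frac{TN}{TN+FP},\qquad \text{PPV} = \frac{TP}{TP+FP},\qquad \text{NPV} = \frac{TN}{TN+FN}.$$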
Colonoscopy is the key technique for the diagnosis of colorectal polyps. However, several studies have shown that 15.4% of colorectal lesions (≤ 3 mm) diagnosed as adenomas under endoscopy were judged to be normal mucosa on pathological examination[21]. Intraobserver and interobserver discrepancies are the main problem[22]. Therefore, some studies have suggested that AI techniques combined with endoscopy and imaging may help physicians identify colorectal lesions and perform pathological classification and prognosis prediction[22].
Shahidi et al[21] established a real-time AI-based clinical decision support system to assess the discrepancies between endoscopic and pathological diagnoses of diminutive colorectal polyps.
Wang et al[23] explored the feasibility of faster region-based CNN technology. They used transfer learning with the ImageNet-pretrained VGG16 model to automatically identify a positive circumferential resection margin on high-resolution magnetic resonance imaging (MRI) of rectal cancer, and the accuracy, sensitivity and specificity were 93.2%, 83.8% and 95.6%, respectively[23]. The use of 18F fluorodeoxyglucose-positron emission tomography (PET)/computed tomography (CT) to assess early changes in glucose metabolism parameters during neoadjuvant chemotherapy can predict treatment efficacy[24,25]. Traditional 18F fluorodeoxyglucose-PET/CT parameters have also been evaluated for predicting the pathological complete response after radiochemotherapy for rectal cancer[26]; Shen et al[28] instead applied a random forest model[27] to 18F fluorodeoxyglucose-PET/CT radiomics to predict the pathological complete response of rectal cancer after chemoradiotherapy (Table 1).
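As a minimal illustration of the transfer-learning idea behind such models (the study above used a Faster R-CNN detector; this sketch only shows how an ImageNet-pretrained VGG16 backbone can be reused for a hypothetical two-class task such as positive vs negative circumferential resection margin, and is not the authors' pipeline):

```python
import torch
import torch.nn as nn
from torchvision import models

# Load a VGG16 backbone pretrained on ImageNet (torchvision >= 0.13 API).
model = models.vgg16(weights=models.VGG16_Weights.IMAGENET1K_V1)

# Freeze the convolutional feature extractor; only the new head is trained.
for param in model.features.parameters():
    param.requires_grad = False

# Replace the 1000-class ImageNet output layer with a 2-class head
# (hypothetical labels: positive vs negative circumferential resection margin).
model.classifier[6] = nn.Linear(in_features=4096, out_features=2)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad), lr=1e-4
)
# Training then proceeds as usual on 3-channel inputs (e.g., 224 x 224 crops).
```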
Ultrasound and radiology are common imaging techniques in breast examination for cancer screening, diagnosis and treatment. Ultrasound is important for the noninvasive evaluation of breast lesions.
Zhou et al[29] proposed a CNN-based deep learning model to predict lymph node metastasis from the ultrasound characteristics of primary breast cancer. The data showed that its AUC was approximately 0.90, and its sensitivity and specificity were above 80% and 70%, respectively. Mango et al[30] integrated their AI-based decision support system into ultrasound imaging, and the results showed that this technique is helpful for Breast Imaging Reporting and Data System (BI-RADS) classification, reducing intraobserver and interobserver variability. With ultrasound alone, the incidence of variability in classifying lesions as BI-RADS 3 vs BI-RADS 4A or above was 13.6%, and it decreased to 10.8% when ultrasound was combined with decision support.
Spick et al[34] showed that adding diffusion-weighted imaging to MRI-guided vacuum-assisted breast biopsy could reduce the FPR by more than 30%. Penco et al[32] verified the accuracy of MRI-guided vacuum-assisted breast biopsy against histopathological results; it exhibited 94% accuracy, 84% sensitivity and 77% specificity, with a negative predictive value of up to 97%. Adachi et al[31] compared a RetinaNet-based AI model with human readers for diagnosing breast cancer on maximum intensity projection dynamic contrast-enhanced MRI; the model alone achieved an AUC of 0.925, compared with 0.884 for readers without AI assistance and 0.899 for readers aided by the model (Table 1).
Sasaki et al[35] reported that the AI-based Transpara system reduced the difference between computers and experts in the detection sensitivity for breast cancer on molybdenum target mammography; similar standalone AI systems have also been compared with radiologists for breast cancer detection in mammography[36].
In summary, AI technology increases the sensitivity of detection of latent breast lesions while maintaining high specificity. It also reduces interpretation variability and helps to improve clinical diagnostic performance.
In recent years, with the increasing incidence of thyroid cancer, the accurate classification of thyroid lesions and the prediction of lymph node metastasis have become the core of clinical intervention[37,38]. Ultrasound is a noninvasive, easily accessible and economical examination tool, but its accuracy may vary according to the experience of the operator.
Barczyński et al[39] verified that the S-Detect™ model in a real-time CAD system did not differ significantly from experienced radiologists in the sensitivity, accuracy and negative predictive value of thyroid tumor classification. The overall accuracy of lesion evaluation was 76% for surgeons with basic ultrasound skills who did not use the CAD system and 82% for those who used it[39]. The sensitivity and negative predictive value of lesion classification by the CAD system were similar to those of ultrasound experts. The system also helped to locate thyroid nodules for subsequent puncture cytology. Nevertheless, the S-Detect™ model had limitations in identifying calcifications[40].
Postoperative lymph node metastasis is a key factor in the local recurrence of thyroid carcinoma, so CT or ultrasound is needed to judge whether lymph node metastasis is present before surgery[37,38]. A study by Lee et al[41] confirmed that a deep learning-based CAD system for the classification of cervical lymph node metastasis from thyroid cancer on CT achieved an AUC of 0.884 (Table 1).
Serum prostate-specific antigen (PSA), digital rectal examination and transrectal ultrasound-guided prostate biopsy are the main methods for the early diagnosis of prostate cancer[42]. An elevated PSA level (> 2 ng/mL) is an important indicator in postoperative monitoring and in identifying recurrence of prostate cancer[43].
Biopsy guided by MRI/ultrasound fusion improves the clinical detection of prostate cancer[44,45]. MRI detects pathological changes that are graded with the Prostate Imaging Reporting and Data System (PI-RADS), and deep learning-based classification has been developed to assist PI-RADS interpretation of multiparametric prostate MRI[46].
Deep learning has been widely applied to MRI in the field of prostate malignancy[47,48]. Even in patients treated with radical prostatectomy whose serum PSA was < 1 ng/mL, 11C-choline PET/CT still showed a 20.5% positive rate[49]. Prostate uptake of 18F-choline is associated with overall survival, making it as important as serum PSA and the Gleason score in distinguishing high-risk from low-risk patients. Polymeri et al[50] used a deep learning-based automatic estimation method, and the automatically measured prostate gland volume (71 mL) was comparable to radiologists' visual estimates (65 mL and 80 mL) and was obtained within seconds. This approach significantly improved the accuracy and precision of PET/CT imaging in the diagnosis of prostate cancer.
Raciti et al[43] used the Paige Prostate Alpha software to significantly increase the detection rate of prostate cancer while maintaining high specificity. For small, poorly differentiated tumors in particular, sensitivity increased by as much as 30%, reaching 90%. Similar AI systems can also be used to detect micrometastases in prostate cancer.
When CT is used to screen pulmonary nodules, the Lung Reporting and Data System (Lung-RADS) can increase sensitivity, but its FPR is also high[51]. The CAD method has 100% sensitivity, but its specificity is extremely low (up to 8.2 false positive nodules per scan)[51]. The negative predictive value of PET/CT for lymph node involvement in peripheral T1 tumors (≤ 3 cm) is as high as 92%-94%[52].
Chauvie et al[51] applied several methods to digital tomosynthesis: (1) Binomial visual analysis, PPV 0.14 and sensitivity 0.95; (2) Lung-RADS, PPV 0.19 and sensitivity 0.65; (3) Logistic regression, PPV 0.29 and sensitivity 0.20; (4) Random forest, PPV 0.40 and sensitivity 0.30; and (5) Neural network, PPV 0.95 and sensitivity 0.90. These data indicated that the neural network was the only predictor of lung cancer with a high PPV and no loss in sensitivity. Tau et al[52] used a CNN to analyze the characteristics of the primary tumor on PET and to evaluate the presence of lymph node metastasis in newly diagnosed non-small cell lung cancer patients. The sensitivity, specificity and accuracy for predicting positive lymph nodes were 0.74 ± 0.32, 0.84 ± 0.16 and 0.80 ± 0.17, respectively; those for predicting distant metastasis were 0.45 ± 0.08, 0.79 ± 0.06 and 0.63 ± 0.05, respectively. The sensitivity for predicting distant lymph node metastasis was low (24% in the early phase and 45% at the end of the monitoring period). The CNN had high specificity (91% in the M1 group and 79% in the follow-up group), but the PPV and negative predictive value for the M category were lower at the end of follow-up (54.5% and 68.6%).
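A minimal sketch of how such a classifier comparison is typically run (synthetic features stand in for the imaging descriptors; this is not the SOS trial pipeline): fit each candidate model and report PPV (precision) and sensitivity (recall) on a held-out set.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import precision_score, recall_score
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Hypothetical feature matrix (e.g., radiomic descriptors) and malignancy labels.
X, y = make_classification(n_samples=400, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

candidates = {
    "logistic regression": LogisticRegression(max_iter=1000),
    "random forest": RandomForestClassifier(n_estimators=200, random_state=0),
    "neural network": MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000,
                                    random_state=0),
}

for name, clf in candidates.items():
    clf.fit(X_train, y_train)
    y_pred = clf.predict(X_test)
    ppv = precision_score(y_test, y_pred)   # PPV is precision
    sens = recall_score(y_test, y_pred)     # sensitivity is recall
    print(f"{name}: PPV={ppv:.2f}, sensitivity={sens:.2f}")
```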
Texture analysis of contrast-enhanced MRI is considered an imaging marker for predicting the early response of hepatocellular carcinoma patients before transarterial chemoembolization (TACE) treatment[53]; its accuracy for distinguishing complete remission from incomplete remission was 0.76. Preoperative dynamic CT texture analysis also has value in predicting the response of hepatocellular carcinoma to TACE. Peng et al[54] used a CT-based deep learning technique (transfer learning) that compensated for the inaccuracy caused by insufficient image information. Their three datasets (one training set and two validation sets) showed high AUCs for predicting the response to TACE: complete response (0.97, 0.98, 0.97), partial response (0.96, 0.96, 0.96), stable disease (0.95, 0.95, 0.94) and progressive disease (0.96, 0.94, 0.97); the corresponding accuracies reached 84.0%, 85.1% and 82.8%[54]. Therefore, the CT-based deep learning model helps physicians preliminarily estimate the initial response of hepatocellular carcinoma patients to TACE and predict its therapeutic effect.
Colposcopy is widely used in the detection of cervical intraepithelial neoplasia, and it can guide cervical biopsy in women with suspected cytological abnormalities or human papillomavirus infection[55,56]. In low- and middle-income countries that lack colposcopy resources, the diagnostic accuracy of cervical biopsy for detecting cervical intraepithelial neoplasia is quite low (30%-70%)[57]. The development and application of AI-guided digital colposcopy (e.g., based on support vector machines) has helped address these bottlenecks, improving the effectiveness of cervical cancer screening and the characterization of cervical lesions[58]. Another advantage of AI is the “real-time” diagnostic report, which further optimizes clinical workflows[58].
Accurate segmentation of the pancreas is important for AI training and AI-assisted guidance. Wolz et al[59] used multi-atlas technology, which achieved a Dice similarity coefficient (DSC) of only 0.70. Summers et al[60] used deep learning technology, which reached a DSC of 0.78. Wang et al[61] proposed an interactive fully convolutional network approach that combines deep learning with image-specific fine-tuning for interactive medical image segmentation (Table 1).
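The DSC values reported here score the overlap between a predicted mask A and the reference mask B as DSC = 2|A ∩ B| / (|A| + |B|); a minimal NumPy implementation for binary masks (illustrative only) is shown below.

```python
import numpy as np

def dice_similarity_coefficient(pred: np.ndarray, truth: np.ndarray) -> float:
    """Compute DSC = 2|A ∩ B| / (|A| + |B|) for two binary segmentation masks."""
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    denominator = pred.sum() + truth.sum()
    if denominator == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    intersection = np.logical_and(pred, truth).sum()
    return 2.0 * intersection / denominator

# Toy example: the masks share 2 voxels; they label 3 and 2 voxels -> DSC = 4/5 = 0.8.
a = np.array([[1, 1, 0], [0, 1, 0], [0, 0, 0]])
b = np.array([[1, 1, 0], [0, 0, 0], [0, 0, 0]])
print(dice_similarity_coefficient(a, b))
```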
Histopathology is the gold standard for clear cell renal cell carcinoma evaluation[63]. The World Health Organization/International Society of Urological Pathology grading system is used to predict the prognosis of clear cell renal cell carcinoma[64-66]. Using CT or MRI findings to grade clear cell renal cell carcinoma is often influenced by subjective factors[67-70]. Cui et al[71] studied machine learning algorithms to extract and analyze the radiomic profiles of small tumors; further grade prediction based on multiparametric MR and multiphase CT radiomics showed promise for noninvasively estimating the International Society of Urological Pathology grade of clear cell renal cell carcinoma[71].
AI exhibits high efficiency, specificity and sensitivity in the classification, identification and diagnosis of solid tumors. After its integration into imaging technology, AI optimizes clinical workflows, decreases discrepancies between readers and reduces the misdiagnosis rate, helping clinicians choose appropriate therapeutic strategies and accurately predict prognosis (Table 1). All these improvements bring great advantages and convenience to current precision medicine. Nevertheless, problems remain: for example, the FPR can increase because of tumor morphology or an uneven mucosal background, and calcifications may be missed because of technical defects. Therefore, AI cannot completely replace humans at present. We believe that with the continuous improvement of AI technology, the application of AI in tumor diagnosis and treatment will have broader prospects, extending beyond solid tumors.
Publish date | Ref. | AI | Application scenarios | Sensitivity | Accuracy | Specificity | PPV | NPV | Detection time | Variation | Volume | AUC | DSC |
10/2020 | Fukuda et al[16] | CNN | Diagnosis of esophageal squamous cell carcinoma | 91.1% | 88.3% |
05/2020 | Zhang et al[7] | CNN | Diagnosis of chronic atrophic gastritis | 94.5% | 94.2% | 94.0% | 0.99 | ||||||
10/2020 | Horiuchi et al[19] | CAD | Diagnosis of early gastric cancer | 87.4% | 85.1% | 82.8% | 83.5% | 86.7% | 0.8684 | ||||
02/2020 | Wang et al[23] | Faster R-CNN | Circumferential resection margin of rectal cancer | 83.8% | 93.2% | 95.6% | |||||||
03/2020 | Shen et al[28] | RF | Pathological complete response of rectal cancer | 95.3% | |||||||||
01/2021 | Abe et al[18] | CNN | Diagnosis of gastric cancer | 58.4% | 87.3% | 26.0% | 45.5 s | ||||||
01/2020 | Zhou et al[29] | CNN | Lymph node metastasis prediction from primary breast cancer | > 80% | > 70% | 0.9 | |||||||
03/2020 | Penco et al[32] | DWI | MRI-guided vacuum-assisted breast biopsy | 84.0% | 94.0% | 77.0% | 97.0% | ||||||
05/2020 | Adachi et al[31] | RetinaNet | Diagnosis of breast cancer | 92.6% | 82.8% | 0.925 | |||||||
Readers without RetinaNet | 84.7% | 84.1% | 0.884 | ||||||||||
Readers with RetinaNet | 88.9% | 82.3% | 0.899 | ||||||||||
02/2020 | Sasaki et al[35] | Experts | Diagnosis of breast cancer | 89.0% | |||||||||
Experts with Transpara system | 95.0% | ||||||||||||
06/2020 | Mango et al[30] | US | Diagnosis of BI-RADS 3 to BI-RADS 4A or above of breast cancer | 13.6% | |||||||||
US+DS | 10.8% | ||||||||||||
02/2020 | Barczyński et al[39] | Doctors without CAD | Classification of thyroid tumor | 76.0% | |||||||||
Doctors with CAD | 82.0% | ||||||||||||
06/2020 | Lee et al[41] | CAD | Diagnosis of thyroid neck lymph node metastasis | 80.2% | 82.8% | 83.0% | 83.0% | 80.2% | 0.884 | ||||
03/2020 | Polymeri et al[50] | CNN | Prostate gland uptake in PET/CT | 71 mL | |||||||||
10/2020 | Raciti et al[43] | Paige Prostate Alpha | Diagnosis of prostate cancer | 90.0% | |||||||||
07/2020 | Chauvie et al[51] | Binomial visual analysis | Lung DTS | 95.0% | 14.0% | ||||||||
Lung-RADS | 65.0% | 19.0% |
Logistic regression | 20.0% | 29.0% | |||||||||||
RF | 30.0% | 40.0% | |||||||||||
Neural network | 90.0% | 95.0% | |||||||||||
07/2020 | Tau et al[52] | CNN | Diagnosis of lymph node metastasis of lung cancer | 74% ± 32% | 80% ± 17% | 84% ± 16% | |||||||
Prediction of distant metastasis of lung cancer | 45% ± 8% | 63% ± 5% | 79% ± 6% | 54.5% | 68.6% |
01/2020 | Peng et al[54] | Transfer learning | Prediction of TACE treatment response of hepatocellular carcinoma | > 82.8% | > 0.94 |
09/2013 | Wolz et al[59] | Multi atlas technology | Segmentation of the pancreas | 70.0% | |||||||||
08/2020 | Gibson et al[62] | Deep learning technology | 78.0% | ||||||||||
iFCN | 72.3% ± 11.4% | ||||||||||||
Artificial segmentation | 15 min to 87.5% DSC |
Manuscript source: Invited manuscript
Specialty type: Methodology
Country/Territory of origin: China
Peer-review report’s scientific quality classification
Grade A (Excellent): 0
Grade B (Very good): 0
Grade C (Good): C
Grade D (Fair): D
Grade E (Poor): 0
P-Reviewer: Hong YY, Liu GH S-Editor: Wang JL L-Editor: Filipodia P-Editor: Yuan YY
1. Zhou B, Xu JW, Cheng YG, Gao JY, Hu SY, Wang L, Zhan HX. Early detection of pancreatic cancer: Where are we now and where are we going? Int J Cancer. 2017;141:231-241.
2. Ding MQ, Chen L, Cooper GF, Young JD, Lu X. Precision Oncology beyond Targeted Therapy: Combining Omics Data with Machine Learning Matches the Majority of Cancer Cells to Effective Therapeutics. Mol Cancer Res. 2018;16:269-278.
3. Bibault JE, Giraud P, Burgun A. Big Data and machine learning in radiation oncology: State of the art and future prospects. Cancer Lett. 2016;382:110-117.
4. Fröhlich H, Balling R, Beerenwinkel N, Kohlbacher O, Kumar S, Lengauer T, Maathuis MH, Moreau Y, Murphy SA, Przytycka TM, Rebhan M, Röst H, Schuppert A, Schwab M, Spang R, Stekhoven D, Sun J, Weber A, Ziemek D, Zupan B. From hype to reality: data science enabling personalized medicine. BMC Med. 2018;16:150.
5. Feng W, Ding Y, Zong W, Ju S. Non-coding RNAs in regulating gastric cancer metastasis. Clin Chim Acta. 2019;496:125-133.
6. Du Y, Bai Y, Xie P, Fang J, Wang X, Hou X, Tian D, Wang C, Liu Y, Sha W, Wang B, Li Y, Zhang G, Shi R, Xu J, Huang M, Han S, Liu J, Ren X, Wang Z, Cui L, Sheng J, Luo H, Zhao X, Dai N, Nie Y, Zou Y, Xia B, Fan Z, Chen Z, Lin S, Li ZS; Chinese Chronic Gastritis Research group. Chronic gastritis in China: a national multi-center survey. BMC Gastroenterol. 2014;14:21.
7. Zhang Y, Li F, Yuan F, Zhang K, Huo L, Dong Z, Lang Y, Zhang Y, Wang M, Gao Z, Qin Z, Shen L. Diagnosing chronic atrophic gastritis by gastroscopy using artificial intelligence. Dig Liver Dis. 2020;52:566-572.
8. Guimarães P, Keller A, Fehlmann T, Lammert F, Casper M. Deep-learning based detection of gastric precancerous conditions. Gut. 2020;69:4-6.
9. Liu T, Zheng H, Gong W, Chen C, Jiang B. The accuracy of confocal laser endomicroscopy, narrow band imaging, and chromoendoscopy for the detection of atrophic gastritis. J Clin Gastroenterol. 2015;49:379-386.
10. Billah M, Waheed S, Rahman MM. An Automatic Gastrointestinal Polyp Detection System in Video Endoscopy Using Fusion of Color Wavelet and Convolutional Neural Network Features. Int J Biomed Imaging. 2017;2017:9545920.
11. Urban G, Tripathi P, Alkayali T, Mittal M, Jalali F, Karnes W, Baldi P. Deep Learning Localizes and Identifies Polyps in Real Time With 96% Accuracy in Screening Colonoscopy. Gastroenterology. 2018;155:1069-1078.e8.
12. Lahner E, Grossi E, Intraligi M, Buscema M, Corleto VD, Delle Fave G, Annibale B. Possible contribution of artificial neural networks and linear discriminant analysis in recognition of patients with suspected atrophic body gastritis. World J Gastroenterol. 2005;11:5867-5873.
13. Muto M, Minashi K, Yano T, Saito Y, Oda I, Nonaka S, Omori T, Sugiura H, Goda K, Kaise M, Inoue H, Ishikawa H, Ochiai A, Shimoda T, Watanabe H, Tajiri H, Saito D. Early detection of superficial squamous cell carcinoma in the head and neck region and esophagus by narrow band imaging: a multicenter randomized controlled trial. J Clin Oncol. 2010;28:1566-1572.
14. Ohmori M, Ishihara R, Aoyama K, Nakagawa K, Iwagami H, Matsuura N, Shichijo S, Yamamoto K, Nagaike K, Nakahara M, Inoue T, Aoi K, Okada H, Tada T. Endoscopic detection and differentiation of esophageal lesions using a deep neural network. Gastrointest Endosc. 2020;91:301-309.e1.
15. Ishihara R, Takeuchi Y, Chatani R, Kidu T, Inoue T, Hanaoka N, Yamamoto S, Higashino K, Uedo N, Iishi H, Tatsuta M, Tomita Y, Ishiguro S. Prospective evaluation of narrow-band imaging endoscopy for screening of esophageal squamous mucosal high-grade neoplasia in experienced and less experienced endoscopists. Dis Esophagus. 2010;23:480-486.
16. Fukuda H, Ishihara R, Kato Y, Matsunaga T, Nishida T, Yamada T, Ogiyama H, Horie M, Kinoshita K, Tada T. Comparison of performances of artificial intelligence versus expert endoscopists for real-time assisted diagnosis of esophageal squamous cell carcinoma (with video). Gastrointest Endosc. 2020;92:848-855.
17. Iwagami H, Ishihara R, Aoyama K, Fukuda H, Shimamoto Y, Kono M, Nakahira H, Matsuura N, Shichijo S, Kanesaka T, Kanzaki H, Ishii T, Nakatani Y, Tada T. Artificial intelligence for the detection of esophageal and esophagogastric junctional adenocarcinoma. J Gastroenterol Hepatol. 2021;36:131-136.
18. Abe S, Oda I. How can endoscopists adapt and collaborate with artificial intelligence for early gastric cancer detection? Dig Endosc. 2021;33:98-99.
19. Horiuchi Y, Hirasawa T, Ishizuka N, Tokai Y, Namikawa K, Yoshimizu S, Ishiyama A, Yoshio T, Tsuchida T, Fujisaki J, Tada T. Performance of a computer-aided diagnosis system in diagnosing early gastric cancer using magnifying endoscopy videos with narrow-band imaging (with videos). Gastrointest Endosc. 2020;92:856-865.e1.
20. Horiuchi Y, Aoyama K, Tokai Y, Hirasawa T, Yoshimizu S, Ishiyama A, Yoshio T, Tsuchida T, Fujisaki J, Tada T. Convolutional Neural Network for Differentiating Gastric Cancer from Gastritis Using Magnified Endoscopy with Narrow Band Imaging. Dig Dis Sci. 2020;65:1355-1363.
21. Shahidi N, Rex DK, Kaltenbach T, Rastogi A, Ghalehjegh SH, Byrne MF. Use of Endoscopic Impression, Artificial Intelligence, and Pathologist Interpretation to Resolve Discrepancies Between Endoscopy and Pathology Analyses of Diminutive Colorectal Polyps. Gastroenterology. 2020;158:783-785.e1.
22. Yang YJ, Cho BJ, Lee MJ, Kim JH, Lim H, Bang CS, Jeong HM, Hong JT, Baik GH. Automated Classification of Colorectal Neoplasms in White-Light Colonoscopy Images via Deep Learning. J Clin Med. 2020;9.
23. Wang D, Xu J, Zhang Z, Li S, Zhang X, Zhou Y, Lu Y. Evaluation of Rectal Cancer Circumferential Resection Margin Using Faster Region-Based Convolutional Neural Network in High-Resolution Magnetic Resonance Images. Dis Colon Rectum. 2020;63:143-151.
24. Guillem JG, Moore HG, Akhurst T, Klimstra DS, Ruo L, Mazumdar M, Minsky BD, Saltz L, Wong WD, Larson S. Sequential preoperative fluorodeoxyglucose-positron emission tomography assessment of response to preoperative chemoradiation: a means for determining longterm outcomes of rectal cancer. J Am Coll Surg. 2004;199:1-7.
25. Capirci C, Rampin L, Erba PA, Galeotti F, Crepaldi G, Banti E, Gava M, Fanti S, Mariani G, Muzzio PC, Rubello D. Sequential FDG-PET/CT reliably predicts response of locally advanced rectal cancer to neo-adjuvant chemo-radiation therapy. Eur J Nucl Med Mol Imaging. 2007;34:1583-1593.
26. Joye I, Deroose CM, Vandecaveye V, Haustermans K. The role of diffusion-weighted MRI and (18)F-FDG PET/CT in the prediction of pathologic complete response after radiochemotherapy for rectal cancer: a systematic review. Radiother Oncol. 2014;113:158-165.
27. Williams JK. Using random forests to diagnose aviation turbulence. Mach Learn. 2014;95:51-70.
28. Shen WC, Chen SW, Wu KC, Lee PY, Feng CL, Hsieh TC, Yen KY, Kao CH. Predicting pathological complete response in rectal cancer after chemoradiotherapy with a random forest using 18F-fluorodeoxyglucose positron emission tomography and computed tomography radiomics. Ann Transl Med. 2020;8:207.
29. Zhou LQ, Wu XL, Huang SY, Wu GG, Ye HR, Wei Q, Bao LY, Deng YB, Li XR, Cui XW, Dietrich CF. Lymph Node Metastasis Prediction from Primary Breast Cancer US Images Using Deep Learning. Radiology. 2020;294:19-28.
30. Mango VL, Sun M, Wynn RT, Ha R. Should We Ignore, Follow, or Biopsy? AJR Am J Roentgenol. 2020;214:1445-1452.
31. Adachi M, Fujioka T, Mori M, Kubota K, Kikuchi Y, Xiaotong W, Oyama J, Kimura K, Oda G, Nakagawa T, Uetake H, Tateishi U. Detection and Diagnosis of Breast Cancer Using Artificial Intelligence Based assessment of Maximum Intensity Projection Dynamic Contrast-Enhanced Magnetic Resonance Images. Diagnostics (Basel). 2020;10.
32. Penco S, Rotili A, Pesapane F, Trentin C, Dominelli V, Faggian A, Farina M, Marinucci I, Bozzini A, Pizzamiglio M, Ierardi AM, Cassano E. MRI-guided vacuum-assisted breast biopsy: experience of a single tertiary referral cancer centre and prospects for the future. Med Oncol. 2020;37:36.
33. Hu Y, Zhang Y, Cheng J. Diagnostic value of molybdenum target combined with DCE-MRI in different types of breast cancer. Oncol Lett. 2019;18:4056-4063.
34. Spick C, Pinker-Domenig K, Rudas M, Helbich TH, Baltzer PA. MRI-only lesions: application of diffusion-weighted imaging obviates unnecessary MR-guided breast biopsies. Eur Radiol. 2014;24:1204-1210.
35. Sasaki M, Tozaki M, Rodríguez-Ruiz A, Yotsumoto D, Ichiki Y, Terawaki A, Oosako S, Sagara Y. Artificial intelligence for breast cancer detection in mammography: experience of use of the ScreenPoint Medical Transpara system in 310 Japanese women. Breast Cancer. 2020;27:642-651.
36. Rodriguez-Ruiz A, Lång K, Gubern-Merida A, Broeders M, Gennaro G, Clauser P, Helbich TH, Chevalier M, Tan T, Mertelmeier T, Wallis MG, Andersson I, Zackrisson S, Mann RM, Sechopoulos I. Stand-Alone Artificial Intelligence for Breast Cancer Detection in Mammography: Comparison With 101 Radiologists. J Natl Cancer Inst. 2019;111:916-922.
37. Shin JH, Baek JH, Chung J, Ha EJ, Kim JH, Lee YH, Lim HK, Moon WJ, Na DG, Park JS, Choi YJ, Hahn SY, Jeon SJ, Jung SL, Kim DW, Kim EK, Kwak JY, Lee CY, Lee HJ, Lee JH, Lee KH, Park SW, Sung JY; Korean Society of Thyroid Radiology (KSThR) and Korean Society of Radiology. Ultrasonography Diagnosis and Imaging-Based Management of Thyroid Nodules: Revised Korean Society of Thyroid Radiology Consensus Statement and Recommendations. Korean J Radiol. 2016;17:370-395.
38. Haugen BR. 2015 American Thyroid Association Management Guidelines for Adult Patients with Thyroid Nodules and Differentiated Thyroid Cancer: What is new and what has changed? Cancer. 2017;123:372-381.
39. Barczyński M, Stopa-Barczyńska M, Wojtczak B, Czarniecka A, Konturek A. Clinical validation of S-Detect™ mode in semi-automated ultrasound classification of thyroid lesions in surgical office. Gland Surg. 2020;9:S77-S85.
40. Kim HL, Ha EJ, Han M. Real-World Performance of Computer-Aided Diagnosis System for Thyroid Nodules Using Ultrasonography. Ultrasound Med Biol. 2019;45:2672-2678.
41. Lee JH, Ha EJ, Kim D, Jung YJ, Heo S, Jang YH, An SH, Lee K. Application of deep learning to the diagnosis of cervical lymph node metastasis from thyroid cancer with CT: external validation and clinical utility for resident training. Eur Radiol. 2020;30:3066-3072.
42. Kroenig M, Schaal K, Benndorf M, Soschynski M, Lenz P, Krauss T, Drendel V, Kayser G, Kurz P, Werner M, Wetterauer U, Schultze-Seemann W, Langer M, Jilg CA. Diagnostic Accuracy of Robot-Guided, Software Based Transperineal MRI/TRUS Fusion Biopsy of the Prostate in a High Risk Population of Previously Biopsy Negative Men. Biomed Res Int. 2016;2016:2384894.
43. Raciti P, Sue J, Ceballos R, Godrich R, Kunz JD, Kapur S, Reuter V, Grady L, Kanan C, Klimstra DS, Fuchs TJ. Novel artificial intelligence system increases the detection of prostate cancer in whole slide images of core needle biopsies. Mod Pathol. 2020;33:2058-2066.
44. Siddiqui MM, Rais-Bahrami S, Turkbey B, George AK, Rothwax J, Shakir N, Okoro C, Raskolnikov D, Parnes HL, Linehan WM, Merino MJ, Simon RM, Choyke PL, Wood BJ, Pinto PA. Comparison of MR/ultrasound fusion-guided biopsy with ultrasound-guided biopsy for the diagnosis of prostate cancer. JAMA. 2015;313:390-397.
45. Kasivisvanathan V, Rannikko AS, Borghi M, Panebianco V, Mynderse LA, Vaarala MH, Briganti A, Budäus L, Hellawell G, Hindley RG, Roobol MJ, Eggener S, Ghei M, Villers A, Bladou F, Villeirs GM, Virdi J, Boxler S, Robert G, Singh PB, Venderink W, Hadaschik BA, Ruffion A, Hu JC, Margolis D, Crouzet S, Klotz L, Taneja SS, Pinto P, Gill I, Allen C, Giganti F, Freeman A, Morris S, Punwani S, Williams NR, Brew-Graves C, Deeks J, Takwoingi Y, Emberton M, Moore CM; PRECISION Study Group Collaborators. MRI-Targeted or Standard Biopsy for Prostate-Cancer Diagnosis. N Engl J Med. 2018;378:1767-1777.
46. Sanford T, Harmon SA, Turkbey EB, Kesani D, Tuncer S, Madariaga M, Yang C, Sackett J, Mehralivand S, Yan P, Xu S, Wood BJ, Merino MJ, Pinto PA, Choyke PL, Turkbey B. Deep-Learning-Based Artificial Intelligence for PI-RADS Classification to Assist Multiparametric Prostate MRI Interpretation: A Development Study. J Magn Reson Imaging. 2020;52:1499-1507.
47. Reda I, Khalil A, Elmogy M, Abou El-Fetouh A, Shalaby A, Abou El-Ghar M, Elmaghraby A, Ghazal M, El-Baz A. Deep Learning Role in Early Diagnosis of Prostate Cancer. Technol Cancer Res Treat. 2018;17:1533034618775530.
48. Wang X, Yang W, Weinreb J, Han J, Li Q, Kong X, Yan Y, Ke Z, Luo B, Liu T, Wang L. Searching for prostate cancer by fully automated magnetic resonance imaging classification: deep learning vs non-deep learning. Sci Rep. 2017;7:15415.
49. Giovacchini G, Guglielmo P, Mapelli P, Incerti E, Gajate AMS, Giovannini E, Riondato M, Briganti A, Gianolli L, Ciarmiello A, Picchio M. 11C-choline PET/CT predicts survival in prostate cancer patients with PSA < 1 NG/mL. Eur J Nucl Med Mol Imaging. 2019;46:921-929.
50. Polymeri E, Sadik M, Kaboteh R, Borrelli P, Enqvist O, Ulén J, Ohlsson M, Trägårdh E, Poulsen MH, Simonsen JA, Hoilund-Carlsen PF, Johnsson ÅA, Edenbrandt L. Deep learning-based quantification of PET/CT prostate gland uptake: association with overall survival. Clin Physiol Funct Imaging. 2020;40:106-113.
51. Chauvie S, De Maggi A, Baralis I, Dalmasso F, Berchialla P, Priotto R, Violino P, Mazza F, Melloni G, Grosso M; SOS Study team. Artificial intelligence and radiomics enhance the positive predictive value of digital chest tomosynthesis for lung cancer detection within SOS clinical trial. Eur Radiol. 2020;30:4134-4140.
52. Tau N, Stundzia A, Yasufuku K, Hussey D, Metser U. Convolutional Neural Networks in Predicting Nodal and Distant Metastatic Potential of Newly Diagnosed Non-Small Cell Lung Cancer on FDG PET Images. AJR Am J Roentgenol. 2020;215:192-197.
53. Yu JY, Zhang HP, Tang ZY, Zhou J, He XJ, Liu YY, Liu XJ, Guo DJ. Value of texture analysis based on enhanced MRI for predicting an early therapeutic response to transcatheter arterial chemoembolisation combined with high-intensity focused ultrasound treatment in hepatocellular carcinoma. Clin Radiol. 2018;73:758.e9-758.e18.
54. Peng J, Kang S, Ning Z, Deng H, Shen J, Xu Y, Zhang J, Zhao W, Li X, Gong W, Huang J, Liu L. Residual convolutional neural network for predicting response of transarterial chemoembolization in hepatocellular carcinoma from CT imaging. Eur Radiol. 2020;30:413-424.
55. Khan MJ, Werner CL, Darragh TM, Guido RS, Mathews C, Moscicki AB, Mitchell MM, Schiffman M, Wentzensen N, Massad LS, Mayeaux EJ Jr, Waxman AG, Conageski C, Einstein MH, Huh WK. ASCCP Colposcopy Standards: Role of Colposcopy, Benefits, Potential Harms, and Terminology for Colposcopic Practice. J Low Genit Tract Dis. 2017;21:223-229.
56. Mayeaux EJ Jr, Novetsky AP, Chelmow D, Garcia F, Choma K, Liu AH, Papasozomenos T, Einstein MH, Massad LS, Wentzensen N, Waxman AG, Conageski C, Khan MJ, Huh WK. ASCCP Colposcopy Standards: Colposcopy Quality Improvement Recommendations for the United States. J Low Genit Tract Dis. 2017;21:242-248.
57. Brown BH, Tidy JA. The diagnostic accuracy of colposcopy - A review of research methodology and impact on the outcomes of quality assurance. Eur J Obstet Gynecol Reprod Biol. 2019;240:182-186.
58. Xue P, Ng MTA, Qiao Y. The challenges of colposcopy for cervical cancer screening in LMICs and solutions by artificial intelligence. BMC Med. 2020;18:169.
59. Wolz R, Chu C, Misawa K, Fujiwara M, Mori K, Rueckert D. Automated abdominal multi-organ segmentation with subject-specific atlas generation. IEEE Trans Med Imaging. 2013;32:1723-1730.
60. Summers RM, Elton DC, Lee S, Zhu Y, Liu J, Bagheri M, Sandfort V, Grayson PC, Mehta NN, Pinto PA, Linehan WM, Perez AA, Graffy PM, O'Connor SD, Pickhardt PJ. Atherosclerotic Plaque Burden on Abdominal CT: Automated Assessment With Deep Learning on Noncontrast and Contrast-enhanced Scans. Acad Radiol. 2020.
61. Wang G, Li W, Zuluaga MA, Pratt R, Patel PA, Aertsen M, Doel T, David AL, Deprest J, Ourselin S, Vercauteren T. Interactive Medical Image Segmentation Using Deep Learning With Image-Specific Fine Tuning. IEEE Trans Med Imaging. 2018;37:1562-1573.
62. Boers TGW, Hu Y, Gibson E, Barratt DC, Bonmati E, Krdzalic J, van der Heijden F, Hermans JJ, Huisman HJ. Interactive 3D U-net for the segmentation of the pancreas in computed tomography scans. Phys Med Biol. 2020;65:065002.
63. Halverson SJ, Kunju LP, Bhalla R, Gadzinski AJ, Alderman M, Miller DC, Montgomery JS, Weizer AZ, Wu A, Hafez KS, Wolf JS Jr. Accuracy of determining small renal mass management with risk stratified biopsies: confirmation by final pathology. J Urol. 2013;189:441-446.
64. Dagher J, Delahunt B, Rioux-Leclercq N, Egevad L, Srigley JR, Coughlin G, Dunglinson N, Gianduzzo T, Kua B, Malone G, Martin B, Preston J, Pokorny M, Wood S, Yaxley J, Samaratunga H. Clear cell renal cell carcinoma: validation of World Health Organization/International Society of Urological Pathology grading. Histopathology. 2017;71:918-925.
65. Delahunt B, Eble JN, Egevad L, Samaratunga H. Grading of renal cell carcinoma. Histopathology. 2019;74:4-17.
66. Kim H, Inomoto C, Uchida T, Furuya H, Komiyama T, Kajiwara H, Kobayashi H, Nakamura N, Miyajima A. Verification of the International Society of Urological Pathology recommendations in Japanese patients with clear cell renal cell carcinoma. Int J Oncol. 2018;52:1139-1148.
67. Zhao J, Zhang P, Chen X, Cao W, Ye Z. Lesion Size and Iodine Quantification to Distinguish Low-Grade From High-Grade Clear Cell Renal Cell Carcinoma Using Dual-Energy Spectral Computed Tomography. J Comput Assist Tomogr. 2016;40:673-677.
68. Parada Villavicencio C, Mc Carthy RJ, Miller FH. Can diffusion-weighted magnetic resonance imaging of clear cell renal carcinoma predict low from high nuclear grade tumors. Abdom Radiol (NY). 2017;42:1241-1249.
69. Aslan A, İnan İ, Aktan A, Ayaz E, Aslan M, Özkanlı SŞ, Yıldırım A, Yıkılmaz A. The utility of ADC measurement techniques for differentiation of low- and high-grade clear cell RCC. Pol J Radiol. 2018;83:e446-e451.
70. Chen C, Kang Q, Xu B, Guo H, Wei Q, Wang T, Ye H, Wu X. Differentiation of low- and high-grade clear cell renal cell carcinoma: Tumor size versus CT perfusion parameters. Clin Imaging. 2017;46:14-19.
71. Cui E, Li Z, Ma C, Li Q, Lei Y, Lan Y, Yu J, Zhou Z, Li R, Long W, Lin F. Predicting the ISUP grade of clear cell renal cell carcinoma with multiparametric MR and multiphase CT radiomics. Eur Radiol. 2020;30:2912-2921.