1
Pandey SK, Rathore YK, Ojha MK, Janghel RR, Sinha A, Kumar A. BCCHI-HCNN: Breast Cancer Classification from Histopathological Images Using Hybrid Deep CNN Models. Journal of Imaging Informatics in Medicine 2025; 38:1690-1703. PMID: 39402357. DOI: 10.1007/s10278-024-01297-2.
Abstract
Breast cancer is the most common cancer in women globally, imposing a significant burden on public health due to high death rates. Data from the World Health Organization show an alarming annual incidence of nearly 2.3 million new cases, drawing the attention of patients, healthcare professionals, and governments alike. Through the examination of histopathological images, this study aims to advance the early and precise identification of breast cancer by utilizing a deep convolutional neural network (CNN)-based model. The model's performance is improved by including multiple classifiers, namely support vector machine (SVM), decision tree, and K-nearest neighbors (KNN), using transfer learning techniques. The experiments evaluate two separate feature vectors, one with and one without principal component analysis (PCA). Extensive comparisons are made against current deep learning models on critical metrics, including false positive rate, true positive rate, accuracy, precision, and recall. The results show that the SVM with PCA features achieves excellent speed and accuracy, reaching 99.5% accuracy, while the decision tree model, although somewhat slower than the SVM, attains the highest accuracy without PCA at 99.4%. This study suggests a viable strategy for improving early breast cancer diagnosis, opening the path to more effective healthcare treatments and better patient outcomes.
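The PCA-then-SVM step this abstract describes can be sketched as follows. This is a minimal illustration with synthetic data standing in for the paper's CNN feature vectors; the array sizes, the seed, and the toy labels are all assumptions, not details from the cited work:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic stand-in for deep CNN feature vectors: 400 images x 512 features,
# with one high-variance direction carrying the (toy) benign/malignant signal
w = rng.normal(size=512)
w /= np.linalg.norm(w)
t = 3.0 * rng.normal(size=(400, 1))
X = t @ w[None, :] + rng.normal(size=(400, 512))
y = (t[:, 0] > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# PCA compresses the feature vector before the SVM classifies it
model = make_pipeline(StandardScaler(), PCA(n_components=50), SVC(kernel="rbf"))
model.fit(X_tr, y_tr)
accuracy = model.score(X_te, y_te)
print(accuracy)
```

The same pipeline object can be refit with the PCA step removed to reproduce the paper's with/without-PCA comparison.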
Affiliation(s)
- Saroj Kumar Pandey
- Department of Computer Engineering & Applications, GLA University, Mathura, India
- Yogesh Kumar Rathore
- Department of Computer Science & Engineering, Shri Shankaracharya Institute of Professional Management and Technology, Raipur, India
- Rekh Ram Janghel
- Department of Information Technology, National Institute of Technology, Raipur, India
- Anurag Sinha
- ICFAI Tech School, Computer Science Department, ICFAI University, Ranchi, Jharkhand, India
- Ankit Kumar
- Department of Information Technology, GGV, Bilaspur, CG, India
2
Chowa SS, Azam S, Montaha S, Payel IJ, Bhuiyan MRI, Hasan MZ, Jonkman M. Graph neural network-based breast cancer diagnosis using ultrasound images with optimized graph construction integrating the medically significant features. J Cancer Res Clin Oncol 2023; 149:18039-18064. PMID: 37982829. PMCID: PMC10725367. DOI: 10.1007/s00432-023-05464-w.
Abstract
PURPOSE An automated computerized approach can aid radiologists in the early diagnosis of breast cancer. In this study, a novel method is proposed for classifying breast tumors as benign or malignant from ultrasound images through a Graph Neural Network (GNN) model utilizing clinically significant features. METHOD Ten informative features are extracted from the region of interest (ROI), based on the radiologists' diagnosis markers. The significance of the features is evaluated using density plots and t-test statistical analysis. A feature table is generated in which each row represents an individual image, considered as a node, and the edges between nodes are determined by calculating the Spearman correlation coefficient. A graph dataset is generated and fed into the GNN model. The model is configured through an ablation study and Bayesian optimization. The optimized model is then evaluated with different correlation thresholds to obtain the highest performance with a shallow graph. Performance consistency is validated with k-fold cross validation. The impact of utilizing ROIs and handcrafted features for breast tumor classification is evaluated by comparing the model's performance against Histogram of Oriented Gradients (HOG) descriptor features from the entire ultrasound image. Lastly, a clustering-based analysis is performed to generate a new filtered graph that considers weak and strong relationships between nodes, based on their similarities. RESULTS The results indicate that with a threshold value of 0.95, the GNN model achieves the highest test accuracy of 99.48%, precision and recall of 100%, and an F1 score of 99.28%, reducing the number of edges by 85.5%. The GNN model's performance is 86.91% with no threshold value for the graph generated from HOG descriptor features. Different threshold values for the Spearman correlation score are experimented with and the performance is compared. No significant differences are observed between the previous graph and the filtered graph. CONCLUSION The proposed approach may aid radiologists in effectively diagnosing breast cancer and learning its tumor patterns.
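The thresholded-correlation graph construction this abstract describes can be sketched in a few lines. The feature values, node count, and seed below are synthetic placeholders; only the 0.95 threshold is taken from the reported results, and SciPy's `spearmanr` is assumed to be available:

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(1)

# Toy feature table: 8 ultrasound images (graph nodes) x 10 handcrafted features
features = rng.normal(size=(8, 10))

# Pairwise Spearman correlation between rows; axis=1 treats each row as a variable
rho, _ = spearmanr(features, axis=1)

# Keep an edge only where the correlation magnitude clears the threshold
# (the paper reports its best accuracy at a threshold of 0.95)
threshold = 0.95
adjacency = (np.abs(rho) >= threshold) & ~np.eye(len(features), dtype=bool)
edge_list = np.argwhere(np.triu(adjacency))
print(adjacency.shape, len(edge_list))
```

Raising the threshold prunes edges and yields the shallower graph the authors optimize for; the resulting adjacency matrix (or edge list) is what a GNN library would consume.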
Affiliation(s)
- Sadia Sultana Chowa
- Faculty of Science and Technology, Charles Darwin University, Casuarina, NT, 0909, Australia
- Sami Azam
- Faculty of Science and Technology, Charles Darwin University, Casuarina, NT, 0909, Australia
- Sidratul Montaha
- Faculty of Science and Technology, Charles Darwin University, Casuarina, NT, 0909, Australia
- Israt Jahan Payel
- Health Informatics Research Laboratory (HIRL), Department of Computer Science and Engineering, Daffodil International University, Dhaka, 1216, Bangladesh
- Md Rahad Islam Bhuiyan
- Faculty of Science and Technology, Charles Darwin University, Casuarina, NT, 0909, Australia
- Md Zahid Hasan
- Health Informatics Research Laboratory (HIRL), Department of Computer Science and Engineering, Daffodil International University, Dhaka, 1216, Bangladesh
- Mirjam Jonkman
- Faculty of Science and Technology, Charles Darwin University, Casuarina, NT, 0909, Australia
3
Emerging Feature Extraction Techniques for Machine Learning-Based Classification of Carotid Artery Ultrasound Images. Computational Intelligence and Neuroscience 2022; 2022:1847981. PMID: 35602622. PMCID: PMC9119795. DOI: 10.1155/2022/1847981.
Abstract
Plaque deposits in the carotid artery are the major cause of stroke and atherosclerosis. Ultrasound imaging is used as an early indicator of disease progression. Classifying the images to identify plaque presence and intima-media thickness (IMT) with machine learning algorithms requires features extracted from the images. A total of 361 images were used for feature extraction, which will assist in further classification of the carotid artery. This study presents the extraction of 65 features, consisting of shape, texture, histogram, correlogram, and morphology features. Principal component analysis (PCA)-based feature selection is performed, and the 22 most significant features, which improve the classification accuracy, are selected. Naive Bayes and dynamic learning vector quantization (DLVQ)-based machine learning classifications are performed with the extracted and selected features, and the results are analyzed.
4
Lyu SY, Zhang Y, Zhang MW, Zhang BS, Gao LB, Bai LT, Wang J. Diagnostic value of artificial intelligence automatic detection systems for breast BI-RADS 4 nodules. World J Clin Cases 2022; 10:518-527. PMID: 35097077. PMCID: PMC8771370. DOI: 10.12998/wjcc.v10.i2.518.
Abstract
BACKGROUND The incidence rate of breast cancer has exceeded that of lung cancer, making it the most common malignant tumor in the world. BI-RADS 4 breast nodules span a wide range of malignancy risks and are associated with challenging clinical decision-making.
AIM To explore the diagnostic value of artificial intelligence (AI) automatic detection systems for BI-RADS 4 breast nodules and to assess whether conventional ultrasound BI-RADS classification with AI automatic detection systems can reduce the probability of BI-RADS 4 biopsy.
METHODS A total of 107 BI-RADS breast nodules confirmed by pathology were selected between June 2019 and July 2020 at Hwa Mei Hospital, University of Chinese Academy of Sciences. These nodules were classified by ultrasound doctors and the AI-SONIC breast system. The diagnostic values of conventional ultrasound, the AI automatic detection system, conventional ultrasound combined with the AI automatic detection system and adjusted BI-RADS classification diagnosis were statistically analyzed.
RESULTS Among the 107 breast nodules, 61 were benign (57.01%) and 46 were malignant (42.99%). With the pathology results as the gold standard, the sensitivity, specificity, accuracy, Youden index, and positive and negative predictive values were 84.78%, 67.21%, 74.77%, 0.5199, 66.10%, and 85.42% for conventional ultrasound BI-RADS classification diagnosis; 86.96%, 75.41%, 80.37%, 0.6237, 72.73%, and 88.46% for automatic AI detection; 80.43%, 90.16%, 85.98%, 0.7059, 86.05%, and 85.94% for conventional ultrasound BI-RADS classification with automatic AI detection; and 93.48%, 67.21%, 78.50%, 0.6069, 68.25%, and 93.18% for adjusted BI-RADS classification, respectively. The biopsy rate, cancer detection rate, and malignancy risk were 100%, 42.99%, and 0% before BI-RADS adjustment and 67.29%, 61.11%, and 1.87% after adjustment, respectively.
CONCLUSION Automatic AI detection has high accuracy in determining benign and malignant BI-RADS 4 breast nodules. Conventional ultrasound BI-RADS classification combined with AI automatic detection can reduce the biopsy rate of BI-RADS 4 breast nodules.
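The metric battery in this abstract follows directly from a 2x2 confusion table. The sketch below recomputes the conventional-ultrasound row; the TP/FN/TN/FP counts are back-calculated from the reported percentages and case totals (46 malignant, 61 benign), not stated in the abstract itself:

```python
def diagnostic_metrics(tp, fn, tn, fp):
    """Standard diagnostic metrics from a 2x2 confusion table."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / (tp + fn + tn + fp)
    youden = sensitivity + specificity - 1  # Youden index
    ppv = tp / (tp + fp)                    # positive predictive value
    npv = tn / (tn + fn)                    # negative predictive value
    return sensitivity, specificity, accuracy, youden, ppv, npv

# Counts implied by the conventional-ultrasound row: TP = 39, FN = 7, TN = 41, FP = 20
sens, spec, acc, youden, ppv, npv = diagnostic_metrics(tp=39, fn=7, tn=41, fp=20)
print(round(sens * 100, 2), round(spec * 100, 2), round(acc * 100, 2),
      round(youden, 4), round(ppv * 100, 2), round(npv * 100, 2))
# 84.78 67.21 74.77 0.52 66.1 85.42 -- matching the reported row
```

The other three rows of the abstract can be checked the same way by back-solving their counts from the stated sensitivity and specificity.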
Affiliation(s)
- Shu-Yi Lyu
- Interventional Therapy Department, Hwa Mei Hospital, University of Chinese Academy of Sciences, Ningbo 315010, Zhejiang Province, China
- Interventional Therapy Department, Ningbo Institute of Life and Health Industry, University of Chinese Academy of Sciences, Ningbo 315010, Zhejiang Province, China
- Yan Zhang
- Interventional Therapy Department, Hwa Mei Hospital, University of Chinese Academy of Sciences, Ningbo 315010, Zhejiang Province, China
- Interventional Therapy Department, Ningbo Institute of Life and Health Industry, University of Chinese Academy of Sciences, Ningbo 315010, Zhejiang Province, China
- Mei-Wu Zhang
- Interventional Therapy Department, Hwa Mei Hospital, University of Chinese Academy of Sciences, Ningbo 315010, Zhejiang Province, China
- Interventional Therapy Department, Ningbo Institute of Life and Health Industry, University of Chinese Academy of Sciences, Ningbo 315010, Zhejiang Province, China
- Bai-Song Zhang
- Interventional Therapy Department, Hwa Mei Hospital, University of Chinese Academy of Sciences, Ningbo 315010, Zhejiang Province, China
- Interventional Therapy Department, Ningbo Institute of Life and Health Industry, University of Chinese Academy of Sciences, Ningbo 315010, Zhejiang Province, China
- Li-Bo Gao
- Interventional Therapy Department, Hwa Mei Hospital, University of Chinese Academy of Sciences, Ningbo 315010, Zhejiang Province, China
- Interventional Therapy Department, Ningbo Institute of Life and Health Industry, University of Chinese Academy of Sciences, Ningbo 315010, Zhejiang Province, China
- Lang-Tao Bai
- Interventional Therapy Department, Hwa Mei Hospital, University of Chinese Academy of Sciences, Ningbo 315010, Zhejiang Province, China
- Interventional Therapy Department, Ningbo Institute of Life and Health Industry, University of Chinese Academy of Sciences, Ningbo 315010, Zhejiang Province, China
- Jue Wang
- Ultrasonography Department, Hwa Mei Hospital, University of Chinese Academy of Sciences, Ningbo 315010, Zhejiang Province, China
- Ultrasonography Department, Ningbo Institute of Life and Health Industry, University of Chinese Academy of Sciences, Ningbo 315010, Zhejiang Province, China
5
Assari Z, Mahloojifar A, Ahmadinejad N. A bimodal BI-RADS-guided GoogLeNet-based CAD system for solid breast masses discrimination using transfer learning. Comput Biol Med 2021; 142:105160. PMID: 34995955. DOI: 10.1016/j.compbiomed.2021.105160.
Abstract
Numerous solid breast masses require sophisticated analysis to establish a differential diagnosis. Consequently, complementary modalities such as ultrasound imaging are frequently required to further evaluate mammographically detected masses. Radiologists mentally integrate complementary information from images acquired of the same patient to make a more conclusive and effective diagnosis; however, this has always been a challenging task. This paper details a novel bimodal GoogLeNet-based CAD system that addresses the challenges of combining information from mammographic and sonographic images for solid breast mass classification. In the proposed framework, each modality is initially trained using two distinct monomodal models. Then, using the high-level feature maps extracted from both modalities, a bimodal model is trained. In order to fully exploit the BI-RADS descriptors, different image content representations of each mass are obtained and used as input images. In addition, a two-step transfer learning strategy is proposed, using an ImageNet pre-trained GoogLeNet model, two publicly available databases, and our collected dataset. Our bimodal model achieves the best recognition results, with sensitivity, specificity, F1-score, Matthews Correlation Coefficient, area under the receiver operating characteristic curve, and accuracy of 90.91%, 89.87%, 90.32%, 80.78%, 95.82%, and 90.38%, respectively. The promising results indicate that the proposed CAD system can facilitate bimodal suspicious mass analysis and thus contribute significantly to improving breast cancer diagnostic performance.
Affiliation(s)
- Zahra Assari
- Department of Biomedical Engineering, Faculty of Electrical and Computer Engineering, Tarbiat Modares University, Tehran, Iran
- Ali Mahloojifar
- Department of Biomedical Engineering, Faculty of Electrical and Computer Engineering, Tarbiat Modares University, Tehran, Iran
- Nasrin Ahmadinejad
- Medical Imaging Center, Cancer Research Institute, Imam Khomeini Hospital Advanced Diagnostic and Interventional Radiology Research Center (ADIR), Tehran University of Medical Sciences (TUMS), Tehran, Iran
6
Qiao M, Hu Y, Guo Y, Wang Y, Yu J. Breast Tumor Classification Based on a Computerized Breast Imaging Reporting and Data System Feature System. J Ultrasound Med 2018; 37:403-415. PMID: 28804937. DOI: 10.1002/jum.14350.
Abstract
OBJECTIVES This work focused on extracting novel and validated digital high-throughput features to present a detailed and comprehensive description of the American College of Radiology Breast Imaging Reporting and Data System (BI-RADS), with the goal of improving the accuracy of ultrasound breast cancer diagnosis. METHODS First, the phase congruency approach was used to segment the tumors automatically. Second, high-throughput features were designed and extracted on the basis of each BI-RADS category. Features were then selected on the basis of a Student t test and a genetic algorithm. Finally, an AdaBoost classifier was used to differentiate benign tumors from malignant ones. RESULTS Experiments were conducted on a database of 138 pathologically proven breast tumors. The system was compared with 6 state-of-the-art BI-RADS feature extraction methods. Using leave-one-out cross-validation, our system achieved the highest overall accuracy of 93.48%, a sensitivity of 94.20%, a specificity of 92.75%, and an area under the receiver operating characteristic curve of 95.67%, all superior to those of the other methods. CONCLUSIONS The experiments demonstrated that our computerized BI-RADS feature system was capable of helping radiologists detect breast cancers more accurately and provided more guidance for final decisions.
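The t-test screening and AdaBoost classification steps named in this abstract can be sketched generically. This is not the authors' pipeline (which also used phase-congruency segmentation and a genetic algorithm); the data, feature counts, and effect sizes below are synthetic assumptions:

```python
import numpy as np
from scipy.stats import ttest_ind
from sklearn.ensemble import AdaBoostClassifier

rng = np.random.default_rng(2)

# Synthetic stand-in: 138 tumors x 40 candidate BI-RADS-style features,
# with the first 5 features shifted for the "malignant" class
X = rng.normal(size=(138, 40))
y = rng.integers(0, 2, size=138)
X[y == 1, :5] += 1.5

# Student t-test screening: keep features whose class means differ significantly
_, p_values = ttest_ind(X[y == 0], X[y == 1], axis=0)
selected = p_values < 0.05
X_sel = X[:, selected]

# AdaBoost then separates benign from malignant on the screened features
clf = AdaBoostClassifier(n_estimators=50, random_state=0).fit(X_sel, y)
print(int(selected.sum()), round(clf.score(X_sel, y), 2))
```

In practice the screened feature set would be further refined (here is where a genetic algorithm could search feature subsets) and the classifier evaluated with leave-one-out cross-validation rather than training accuracy.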
Affiliation(s)
- Mengyun Qiao
- Department of Electronic Engineering, Fudan University, Shanghai, China
- Yuzhou Hu
- Department of Electronic Engineering, Fudan University, Shanghai, China
- Yi Guo
- Department of Electronic Engineering, Fudan University, Shanghai, China
- Yuanyuan Wang
- Department of Electronic Engineering, Fudan University, Shanghai, China
- Jinhua Yu
- Department of Electronic Engineering, Fudan University, Shanghai, China
7
Rodríguez-Cristerna A, Gómez-Flores W, de Albuquerque Pereira WC. A computer-aided diagnosis system for breast ultrasound based on weighted BI-RADS classes. Comput Methods Programs Biomed 2018; 153:33-40. PMID: 29157459. DOI: 10.1016/j.cmpb.2017.10.004.
Abstract
BACKGROUND AND OBJECTIVE Conventional computer-aided diagnosis (CAD) systems for breast ultrasound (BUS) are trained to classify pathological classes, that is, benign and malignant. From a clinical perspective, however, this kind of classification does not fully agree with radiologists' diagnoses. Usually, the tumors are assessed using a BI-RADS (Breast Imaging-Reporting and Data System) category and, accordingly, a recommendation is issued: annual study for category 2 (benign), six-month follow-up study for category 3 (probably benign), and biopsy for categories 4 and 5 (suspicious of malignancy). Hence, in this paper, a CAD system based on BI-RADS categories weighted by pathological information is presented. The goal is to increase the classification performance by reducing the common class imbalance found in pathological classes, as well as to provide outcomes quite similar to radiologists' recommendations. METHODS The BUS dataset comprises 781 benign lesions and 347 malignant tumors proven by biopsy. Moreover, every lesion is associated with one BI-RADS category in the set {2, 3, 4, 5}. Thus, the dataset is split into three weighted classes: benign, BI-RADS 2 in benign lesions; probably benign, BI-RADS 3 and 4 in benign lesions; and malignant, BI-RADS 4 and 5 in malignant tumors. Thereafter, a random forest (RF) classifier, denoted RFw, is trained to predict the weighted BI-RADS classes. In addition, for comparison purposes, an RF classifier is trained to predict pathological classes, denoted RFp. RESULTS The ability of the classifiers to predict the pathological classes is measured by the area under the ROC curve (AUC), sensitivity (SEN), and specificity (SPE). The RFw classifier obtained AUC = 0.872, SEN = 0.826, and SPE = 0.919, whereas the RFp classifier reached AUC = 0.868, SEN = 0.808, and SPE = 0.929. According to a one-way analysis of variance test, the RFw classifier statistically outperforms (p < 0.001) the RFp classifier in terms of AUC and SEN. Moreover, the classification performance of RFw in predicting weighted BI-RADS classes, given by the Matthews correlation coefficient, was 0.614. CONCLUSIONS The division of the classification problem into three classes reduces the imbalance between benign and malignant classes; thus, the sensitivity is increased without degrading the specificity. Therefore, the CAD system based on weighted BI-RADS classes improves the classification performance of conventional CAD systems. Additionally, the proposed approach has the advantage of providing a multiclass outcome related to radiologists' recommendations.
Affiliation(s)
- Arturo Rodríguez-Cristerna
- Center for Research and Advanced Studies of the National Polytechnic Institute, ZIP 87130, Ciudad Victoria, Tamaulipas, Mexico
- Wilfrido Gómez-Flores
- Center for Research and Advanced Studies of the National Polytechnic Institute, ZIP 87130, Ciudad Victoria, Tamaulipas, Mexico
8
Casti P, Mencattini A, Sammarco I, Velappa SJ, Magna G, Cricenti A, Luce M, Pietroiusti A, Lesci GI, Ferrucci L, Magrini A, Martinelli E, Di Natale C. Robust classification of biological samples in atomic force microscopy images via multiple filtering cooperation. Knowl Based Syst 2017. DOI: 10.1016/j.knosys.2017.07.016.
9
Choi JH, Kang BJ, Baek JE, Lee HS, Kim SH. Application of computer-aided diagnosis in breast ultrasound interpretation: improvements in diagnostic performance according to reader experience. Ultrasonography 2017; 37:217-225. PMID: 28992680. PMCID: PMC6044219. DOI: 10.14366/usg.17046.
Abstract
Purpose The purpose of this study was to evaluate the usefulness of applying computer-aided diagnosis (CAD) to breast ultrasound (US), depending on the reader's experience with breast imaging. Methods Between October 2015 and January 2016, two experienced readers obtained and analyzed the grayscale US images of 200 cases according to the Breast Imaging Reporting and Data System (BI-RADS) lexicon and categories. They additionally applied CAD (S-Detect) to analyze the lesions and made a subjective diagnostic decision based on grayscale US with CAD. For the same cases, two inexperienced readers analyzed the grayscale US images using the BI-RADS lexicon and categories, added CAD, and came to a subjective diagnostic conclusion. We then compared the diagnostic performance according to the readers' experience with breast imaging. Results The sensitivity values for the experienced readers, inexperienced readers, and CAD (for experienced and inexperienced readers) were 91.7%, 75.0%, 75.0%, and 66.7%, respectively. The corresponding specificity values were 76.6%, 71.8%, 78.2%, and 76.1%. When diagnoses were made subjectively in combination with CAD, the specificity of the experienced readers significantly improved (76.6% to 80.3%) without a change in sensitivity (91.7%). After subjective combination with CAD, both the sensitivity and specificity of the inexperienced readers improved (75.0% to 83.3% and 71.8% to 77.1%). In addition, the area under the curve improved for both the experienced and inexperienced readers (0.84 to 0.86 and 0.73 to 0.80) after the addition of CAD. Conclusion CAD is more useful for less experienced readers. Combining CAD with breast US led to improved specificity for both experienced and inexperienced readers.
Affiliation(s)
- Ji-Hye Choi
- Department of Radiology, Bucheon St. Mary's Hospital, College of Medicine, The Catholic University of Korea, Bucheon, Korea
- Bong Joo Kang
- Department of Radiology, Seoul St. Mary's Hospital, College of Medicine, The Catholic University of Korea, Seoul, Korea
- Ji Eun Baek
- Department of Radiology, Seoul St. Mary's Hospital, College of Medicine, The Catholic University of Korea, Seoul, Korea
- Hyun Sil Lee
- Department of Radiology, Seoul St. Mary's Hospital, College of Medicine, The Catholic University of Korea, Seoul, Korea
- Sung Hun Kim
- Department of Radiology, Seoul St. Mary's Hospital, College of Medicine, The Catholic University of Korea, Seoul, Korea
10
Hu Y, Qiao M, Guo Y, Wang Y, Yu J, Li J, Chang C. Reproducibility of quantitative high-throughput BI-RADS features extracted from ultrasound images of breast cancer. Med Phys 2017; 44:3676-3685. DOI: 10.1002/mp.12275.
Affiliation(s)
- Yuzhou Hu
- Department of Electronic Engineering, Fudan University, Shanghai 200433, China
- Mengyun Qiao
- Department of Electronic Engineering, Fudan University, Shanghai 200433, China
- Yi Guo
- Department of Electronic Engineering, Fudan University, and Key Laboratory of Medical Imaging Computing and Computer Assisted Intervention of Shanghai, Shanghai 200433, China
- Yuanyuan Wang
- Department of Electronic Engineering, Fudan University, and Key Laboratory of Medical Imaging Computing and Computer Assisted Intervention of Shanghai, Shanghai 200433, China
- Jinhua Yu
- Department of Electronic Engineering, Fudan University, and Key Laboratory of Medical Imaging Computing and Computer Assisted Intervention of Shanghai, Shanghai 200433, China
- Jiawei Li
- Department of Ultrasound, Fudan University Shanghai Cancer Center, Shanghai 200032, China
- Cai Chang
- Department of Ultrasound, Fudan University Shanghai Cancer Center, Shanghai 200032, China