
Proceedings Paper

Evaluating deep learning techniques for dynamic contrast-enhanced MRI in the diagnosis of breast cancer

Paper Abstract

Deep learning has shown promise in the field of computer vision for image recognition. We evaluated two deep transfer learning techniques (feature extraction and fine-tuning) in the diagnosis of breast cancer, compared to a lesion-based radiomics computer-aided diagnosis (CAD) method. The dataset included a total of 2006 breast lesions (1506 malignant and 500 benign) imaged with dynamic contrast-enhanced MRI. Pre-contrast, first post-contrast, and second post-contrast timepoint images for each lesion were combined to form an RGB image, which subsequently served as input to a VGG19 convolutional neural network (CNN) pre-trained on the ImageNet database. The first transfer learning technique was feature extraction: feature output was extracted from each of the five max-pooling layers of the trained CNN, average-pooled, reduced in dimensionality, and fed to a support vector machine for the classification of malignant and benign lesions. The second transfer learning method used a 64% training, 16% validation, and 20% testing dataset split to fine-tune the final fully connected layers of the pretrained VGG19 to classify the images as malignant or benign. The performance of each of the three CAD methods was evaluated using receiver operating characteristic (ROC) analysis, with the area under the ROC curve (AUC) as the performance metric in the task of distinguishing between malignant and benign lesions. The performance of the radiomics CAD (AUC = 0.90) was significantly better than that of CNN feature extraction (AUC = 0.84; p < 0.0001); however, we failed to show a significant difference from the fine-tuning method (AUC = 0.86; p = 0.1251). We therefore conclude that transfer learning shows potential as a comparable computer-aided diagnosis technique.
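The feature-extraction pipeline described in the abstract — stacking the three DCE-MRI timepoints into an RGB composite and classifying pooled CNN features with a support vector machine — can be sketched as follows. This is a minimal illustration, not the authors' code: the image size, the per-channel normalization, and the `extract_pooled_features` stand-in are assumptions. In the paper, a VGG19 pre-trained on ImageNet supplies average-pooled feature maps from its five max-pooling layers (64 + 128 + 256 + 512 + 512 = 1472 channels); here a fixed random projection mimics that 1472-dimensional output so the sketch stays self-contained.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def to_rgb_composite(pre, post1, post2):
    """Stack pre-contrast, first post-contrast, and second post-contrast
    images into one 3-channel (RGB) array, each channel scaled to [0, 1]."""
    def norm(ch):
        ch = ch.astype(np.float32)
        rng = ch.max() - ch.min()
        return (ch - ch.min()) / rng if rng > 0 else ch * 0.0
    return np.stack([norm(pre), norm(post1), norm(post2)], axis=-1)

# Hypothetical stand-in for the VGG19 stage: a fixed random projection
# producing a 1472-dim vector, matching the concatenated channel counts
# of VGG19's five max-pooling layers after global average pooling.
rng = np.random.default_rng(0)
proj = rng.normal(size=(64 * 64 * 3, 1472))

def extract_pooled_features(img):
    return img.reshape(-1) @ proj

# Toy dataset: random "lesions" with binary labels (1 = malignant).
X = np.stack([
    extract_pooled_features(to_rgb_composite(*rng.random((3, 64, 64))))
    for _ in range(40)
])
y = rng.integers(0, 2, size=40)

# SVM on standardized CNN-style features; predicted probabilities
# would feed the ROC/AUC analysis used in the paper.
clf = make_pipeline(StandardScaler(), SVC(probability=True))
clf.fit(X, y)
scores = clf.predict_proba(X)[:, 1]
```

In practice the random projection would be replaced by a real pretrained VGG19 (e.g. `tensorflow.keras.applications.VGG19` with `include_top=False`), and a feature-reduction step would precede the SVM, as the abstract notes.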

Paper Details

Date Published: 13 March 2019
PDF: 7 pages
Proc. SPIE 10950, Medical Imaging 2019: Computer-Aided Diagnosis, 1095006 (13 March 2019); doi: 10.1117/12.2512667
Author Affiliations:
Rachel Anderson, The Univ. of Chicago (United States)
Hui Li, The Univ. of Chicago (United States)
Yu Ji, The Univ. of Chicago (United States)
Tianjin Medical Univ. Cancer Institute & Hospital (China)
Peifang Liu, Tianjin Medical Univ. Cancer Institute & Hospital (China)
Maryellen L. Giger, The Univ. of Chicago (United States)


Published in SPIE Proceedings Vol. 10950:
Medical Imaging 2019: Computer-Aided Diagnosis
Kensaku Mori; Horst K. Hahn, Editor(s)

© SPIE