KEYWORDS: Image segmentation, Breast, 3D image processing, 3D imaging standards, Magnetic resonance imaging, Education and training, Cross validation, 3D modeling, 3D image enhancement, Artificial intelligence
Purpose: Given the dependence of radiomic-based computer-aided diagnosis artificial intelligence on accurate lesion segmentation, we assessed the performance of 2D and 3D U-Nets in breast lesion segmentation on dynamic contrast-enhanced (DCE) magnetic resonance imaging (MRI) relative to fuzzy c-means (FCM) and radiologist segmentations.
Approach: Using 994 unique breast lesions imaged with DCE-MRI, three segmentation algorithms (FCM clustering and 2D and 3D U-Net convolutional neural networks) were investigated. Center-slice segmentations produced by FCM, 2D U-Net, and 3D U-Net were evaluated using radiologist segmentations as truth, and volumetric segmentations produced by 2D U-Net (slice by slice) and 3D U-Net were compared using FCM as a surrogate reference standard. Fivefold cross-validation by lesion was conducted on the U-Nets; Dice similarity coefficient (DSC) and Hausdorff distance (HD) served as performance metrics. Segmentation performance was compared across input image types and lesion types.
Results: The 2D U-Net outperformed the 3D U-Net for both center-slice segmentation (DSC, HD: p < 0.001) and volume segmentation (DSC, HD: p < 0.001), and it outperformed FCM in center-slice segmentation (DSC: p < 0.001). Second postcontrast subtraction images yielded better performance than first postcontrast subtraction images with both the 2D and 3D U-Nets (DSC: p < 0.05). Additionally, mass segmentation outperformed nonmass segmentation on first and second postcontrast subtraction images with both the 2D and 3D U-Nets (DSC, HD: p < 0.001).
Conclusions: The results suggest that the 2D U-Net is promising for segmenting mass and nonmass enhancing breast lesions on first and second postcontrast subtraction MRIs and could be an effective alternative to FCM or the 3D U-Net.
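To make the two evaluation metrics concrete, the following is a minimal sketch, not the study's code, of computing the Dice similarity coefficient and the symmetric Hausdorff distance between a predicted binary mask and a reference mask. The array shapes, variable names, and the SciPy-based Hausdorff computation are assumptions made for illustration.

```python
# Sketch of the two performance metrics used in the study: DSC and HD,
# computed between two binary segmentation masks (illustrative example only).
import numpy as np
from scipy.spatial.distance import directed_hausdorff

def dice_coefficient(pred: np.ndarray, truth: np.ndarray) -> float:
    """DSC = 2|A intersect B| / (|A| + |B|) for binary masks."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    intersection = np.logical_and(pred, truth).sum()
    denom = pred.sum() + truth.sum()
    return 2.0 * intersection / denom if denom > 0 else 1.0

def hausdorff_distance(pred: np.ndarray, truth: np.ndarray) -> float:
    """Symmetric Hausdorff distance between foreground pixel/voxel coordinates."""
    a, b = np.argwhere(pred), np.argwhere(truth)
    return max(directed_hausdorff(a, b)[0], directed_hausdorff(b, a)[0])

# Example: compare a hypothetical predicted center-slice mask with a reference mask.
pred = np.zeros((64, 64), dtype=bool); pred[20:40, 22:42] = True
truth = np.zeros((64, 64), dtype=bool); truth[22:42, 20:40] = True
print(f"DSC = {dice_coefficient(pred, truth):.3f}, HD = {hausdorff_distance(pred, truth):.1f}")
```

In the study, these metrics were computed per lesion within a fivefold cross-validation split by lesion; the snippet above shows only the per-mask calculation.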
Computer-aided diagnosis based on features extracted from medical images relies heavily on accurate lesion segmentation before feature extraction. Using 994 unique breast lesions imaged with dynamic contrast-enhanced (DCE) MRI, several segmentation algorithms were investigated. The first method is fuzzy c-means (FCM), a well-established unsupervised clustering algorithm used on breast MRIs. The second and third methods are based on U-Net, a widely used convolutional neural network for image segmentation, applied to two- and three-dimensional MRI data, respectively. The purpose of this study was twofold: (1) to assess the performance of 2D (slice-by-slice) and 3D U-Nets in breast lesion segmentation on DCE-MRI when trained with FCM segmentations and (2) to compare their performance with that of FCM. Center-slice segmentations produced by FCM, 2D U-Net, and 3D U-Net were evaluated using radiologist segmentations as truth, and volumetric segmentations produced by 2D U-Net (slice by slice) and 3D U-Net were compared using FCM as a surrogate truth. Fivefold cross-validation was conducted on the U-Nets, and Dice similarity coefficient (DSC) and Hausdorff distance (HD) were used as performance metrics. Although the 3D U-Net performed well, the 2D U-Net outperformed the 3D U-Net for both center-slice (DSC: p = 4.13 × 10⁻⁹; HD: p = 1.40 × 10⁻²) and volume segmentations (DSC: p = 2.72 × 10⁻⁸³; HD: p = 2.28 × 10⁻¹⁰). Additionally, the 2D U-Net outperformed FCM in center-slice segmentation in terms of DSC (p = 1.09 × 10⁻⁷). The results suggest that the 2D U-Net is promising in segmenting breast lesions and could be an effective alternative to FCM.
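For readers unfamiliar with fuzzy c-means, below is a minimal sketch of FCM clustering applied to voxel intensities with two clusters (lesion versus background). It is not the authors' implementation; the fuzziness exponent m = 2, the convergence tolerance, and the synthetic intensity data are illustrative assumptions.

```python
# Minimal fuzzy c-means sketch for intensity-based lesion clustering
# (illustrative only; not the study's implementation).
import numpy as np

def fuzzy_c_means(x: np.ndarray, n_clusters: int = 2, m: float = 2.0,
                  n_iter: int = 100, tol: float = 1e-5, seed: int = 0):
    """Cluster 1D intensity samples x; returns (memberships, cluster centers)."""
    rng = np.random.default_rng(seed)
    u = rng.random((n_clusters, x.size))
    u /= u.sum(axis=0)                          # memberships sum to 1 per voxel
    centers = np.zeros(n_clusters)
    for _ in range(n_iter):
        um = u ** m
        centers = um @ x / um.sum(axis=1)       # fuzzy-weighted cluster centers
        dist = np.abs(x[None, :] - centers[:, None]) + 1e-12
        u_new = dist ** (-2.0 / (m - 1.0))      # standard FCM membership update
        u_new /= u_new.sum(axis=0)
        if np.max(np.abs(u_new - u)) < tol:
            u = u_new
            break
        u = u_new
    return u, centers

# Example: synthetic ROI intensities; the brighter cluster is taken as the enhancing lesion.
rng = np.random.default_rng(42)
intensities = np.concatenate([rng.normal(100, 10, 500),    # background-like voxels
                              rng.normal(400, 30, 120)])   # enhancing-lesion-like voxels
u, centers = fuzzy_c_means(intensities)
lesion_cluster = int(np.argmax(centers))
lesion_mask_1d = u[lesion_cluster] > 0.5
```

In practice, the resulting voxel memberships would be thresholded and mapped back to the 3D region of interest to form the FCM segmentation used as a reference in this comparison.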