Automatic breast ultrasound (ABUS) imaging provides complementary information when other imaging modalities (e.g., mammography) are inconclusive for tumor detection. It enables sectional-plane visualization, simplified temporal comparison, and coronal depiction, and offers higher reproducibility than conventional ultrasound imaging. Although 3D ABUS acquisition significantly reduces acquisition time and cost, manual segmentation of tumors on 3D ABUS can be time-consuming and labor-intensive due to the large number of slices. This work aims to develop a deep-learning-based method to automatically segment breast tumors on 3D ABUS. We integrated a mask-scoring-based self-correlation strategy into an R-CNN-based method to encourage more plausible final tumor contours. We tested the performance of the proposed method on 3D ABUS scans of 40 patients with confirmed breast tumors using a four-fold cross-validation test. The agreement between our results and the ground-truth contours was quantified by the Dice similarity coefficient (DSC), Jaccard index (JAC), and 95% Hausdorff distance (HD95). The mean DSC, JAC, and HD95 were 0.85 ± 0.10, 0.75 ± 0.14, and 1.65 ± 1.20 mm, respectively.
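For reference, the three evaluation metrics above can be computed directly from binary segmentation masks. The following is a minimal NumPy/SciPy sketch, not the authors' evaluation code; the array names `pred` and `gt` (boolean 3D masks) and the voxel `spacing` in mm are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import binary_erosion, distance_transform_edt

def dsc(pred, gt):
    # Dice similarity coefficient: 2|A∩B| / (|A| + |B|)
    inter = np.logical_and(pred, gt).sum()
    return 2.0 * inter / (pred.sum() + gt.sum())

def jac(pred, gt):
    # Jaccard index: |A∩B| / |A∪B|
    inter = np.logical_and(pred, gt).sum()
    return inter / np.logical_or(pred, gt).sum()

def hd95(pred, gt, spacing=(1.0, 1.0, 1.0)):
    # 95th-percentile symmetric Hausdorff distance in mm.
    # Surface voxels = mask minus its binary erosion.
    ps = pred & ~binary_erosion(pred)
    gs = gt & ~binary_erosion(gt)
    # Distance from every voxel to the nearest surface voxel
    # of the other mask, scaled by the physical voxel spacing.
    d_to_gt = distance_transform_edt(~gs, sampling=spacing)
    d_to_pred = distance_transform_edt(~ps, sampling=spacing)
    dists = np.concatenate([d_to_gt[ps], d_to_pred[gs]])
    return np.percentile(dists, 95)
```

The HD95 here pools the directed surface distances in both directions before taking the 95th percentile, which is one common convention; the paper may use a slightly different variant.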
Automated 3D breast ultrasound (ABUS) has substantial potential in breast imaging. ABUS appears to be beneficial because of its outstanding reproducibility and reliability, especially for screening women with dense breasts. However, the high number of slices in 3D ABUS demands lengthy reading time from radiologists, who may miss small and subtle lesions. In this work, we propose a 3D Mask R-CNN method to automatically detect the tumor location and simultaneously segment the tumor contour. The performance of the proposed algorithm was evaluated on 25 patients' ABUS images with ground-truth contours. To further assess the performance of the proposed method, we quantified the intersection over union (IoU), Dice similarity coefficient (DSC), and center-of-mass distance (CMD) between the ground truth and the segmentation. The resulting IoU, DSC, and CMD were 96% ± 2%, 84% ± 3%, and 1.95 ± 0.89 mm, respectively, demonstrating the high accuracy of tumor detection and 3D volume segmentation with the proposed Mask R-CNN method. We have developed a novel deep-learning-based method and demonstrated its potential as a useful tool for computer-aided diagnosis and treatment.
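The CMD reported here is presumably the Euclidean distance between the centroids of the predicted and ground-truth masks in physical coordinates. A hedged sketch of that computation, again assuming boolean 3D masks and a known voxel spacing in mm (names are illustrative, not from the paper):

```python
import numpy as np
from scipy.ndimage import center_of_mass

def cmd_mm(pred, gt, spacing=(1.0, 1.0, 1.0)):
    # Center-of-mass distance: Euclidean distance between the two
    # mask centroids, with voxel indices scaled to mm by the spacing.
    cp = np.asarray(center_of_mass(pred.astype(float))) * np.asarray(spacing)
    cg = np.asarray(center_of_mass(gt.astype(float))) * np.asarray(spacing)
    return float(np.linalg.norm(cp - cg))
```

IoU between masks is numerically identical to the Jaccard index shown earlier; for the detection stage it may instead be computed on bounding boxes, which the abstract does not specify.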