In this work, we explore texture analysis of optical coherence tomography (OCT) images and machine learning for automated classification of breast biopsies. Under an approved IRB protocol, breast biopsy specimens from 100 patients were imaged with a high-resolution OCT system providing 3.7 µm axial resolution. The extracted texture features were first-order statistics (histogram distribution) and second-order statistics (e.g., gray-level co-occurrence matrix, GLCM, features). Binary classification was carried out for two cases: 1) risk 0 (no risk of cancer) versus all other categories and 2) risk 3 (cancer) versus all other categories.
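The abstract does not detail the feature-extraction pipeline; the sketch below illustrates one common way to compute histogram-based first-order statistics and GLCM second-order features with SciPy and scikit-image. The offset and angle choices and the `texture_features` helper are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch only: assumes an OCT B-scan stored as an 8-bit grayscale NumPy array.
import numpy as np
from scipy import stats
from skimage.feature import graycomatrix, graycoprops

def texture_features(bscan_u8):
    """Extract first-order (histogram) and second-order (GLCM) texture features."""
    # First-order statistics computed from the intensity distribution.
    flat = bscan_u8.ravel().astype(float)
    first_order = {
        "mean": flat.mean(),
        "variance": flat.var(),
        "skewness": stats.skew(flat),
        "kurtosis": stats.kurtosis(flat),
    }

    # Second-order statistics from a gray-level co-occurrence matrix (GLCM).
    glcm = graycomatrix(
        bscan_u8,
        distances=[1, 2, 4],               # pixel offsets (assumed)
        angles=[0, np.pi / 4, np.pi / 2],  # directions (assumed)
        levels=256,
        symmetric=True,
        normed=True,
    )
    second_order = {
        prop: graycoprops(glcm, prop).mean()
        for prop in ("contrast", "correlation", "energy", "homogeneity")
    }
    return {**first_order, **second_order}

# Example with a dummy 8-bit B-scan.
features = texture_features(np.random.randint(0, 256, (512, 1000), dtype=np.uint8))
```

The resulting feature vector would then feed a conventional classifier for the two binary tasks described above.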
Significance: Real-time histology can close a variety of gaps in tissue diagnostics. Currently, gross pathology analysis of excised tissue is dependent upon visual inspection and palpation to identify regions of interest for histopathological processing. Such analysis is limited by the variable correlation between macroscopic and microscopic findings. The current standard of care is costly, burdensome, and inefficient. Aim: We are the first to address this gap by introducing optical coherence tomography (OCT) integrated in real time into the pathology grossing process. Approach: This is achieved with our high-resolution, ultrahigh-speed, large field-of-view OCT device designed for this clinical application. Results: We demonstrate the feasibility of imaging tissue sections from multiple human organs (breast, prostate, lung, and pancreas) in a clinical gross pathology setting without interrupting standard workflows. Conclusions: OCT-based real-time histology evaluation holds promise for addressing a gap that has been present for >100 years.
Optical coherence tomography (OCT) is being studied to provide rapid biopsy evaluation. Here we developed a deep learning algorithm to rapidly identify disease in OCT images in an 87-patient IRB-approved clinical study. Pathologists labeled each biopsy as one of two categories: non-interest (no disease) and interest (requiring further pathological analysis). The dataset was split by patient into training (n = 70) and validation (n = 17) sets. A ResNet-18 network was trained with the Adam optimizer (learning rate 0.01, batch size 8) for 30 epochs. The network achieved 97% training accuracy and 70% validation accuracy.
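A minimal PyTorch sketch of the reported training configuration (ResNet-18, Adam, learning rate 0.01, batch size 8, 30 epochs, two classes) follows. The placeholder dataset, input size, and channel count are assumptions, since the abstract does not describe data loading or preprocessing.

```python
# Sketch of the stated training setup; dummy tensors stand in for the OCT dataset.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset
from torchvision.models import resnet18

device = "cuda" if torch.cuda.is_available() else "cpu"

model = resnet18()                                   # randomly initialized; pretraining not stated
model.fc = nn.Linear(model.fc.in_features, 2)        # two classes: interest vs. non-interest
model = model.to(device)

optimizer = torch.optim.Adam(model.parameters(), lr=0.01)
criterion = nn.CrossEntropyLoss()

# Placeholder data: 32 fake "OCT images" at 224 x 224, replicated to 3 channels (assumed).
train_ds = TensorDataset(torch.randn(32, 3, 224, 224), torch.randint(0, 2, (32,)))
train_loader = DataLoader(train_ds, batch_size=8, shuffle=True)

for epoch in range(30):
    model.train()
    for images, labels in train_loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```

The large gap between training and validation accuracy reported above is consistent with overfitting on a small per-patient split, which such a configuration would typically address with augmentation or regularization.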
This Conference Presentation, "Real-time histology evaluation by optical coherence tomography (OCT) holds promise to improve the diagnostic anatomic pathology gross evaluation process," was recorded at SPIE Photonics West held in San Francisco, California, United States.
The purpose of this study was to develop a generalizable convolutional neural network (CNN) technique for classifying optical coherence tomography (OCT) images of breast tissue acquired from multiple OCT systems. We imaged lumpectomy and mastectomy specimens (acquired through the Columbia University Tissue Bank) from 31 patients. In our early results, we classified the images into healthy tissue (adipose and stroma) and diseased tissue, which included ductal carcinoma in situ (DCIS), mucinous carcinoma, and invasive ductal carcinoma (IDC). Our goal is to expand the classification to differentiate the diseased tissue into subclasses of DCIS, IDC, mucinous carcinoma, and benign tissue.
The purpose of this study was to develop an ultrahigh-speed, high-resolution, large-imaging-area optical coherence tomography (OCT) system for improved evaluation of breast cancer surgical specimens. A two-axis, direct-drive, servo-motorized linear translation stage enables automatic acquisition and stitching of multiple OCT volumes, providing large-area imaging of tissue up to 10 cm × 10 cm in surface area and 2 mm in imaging depth. OCT images of breast specimens acquired with this system are expected to demonstrate normal breast parenchyma, cysts, radial scars, benign neoplasms such as fibroadenoma, atypical ductal/lobular hyperplasia, and in situ and invasive carcinomas.
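As an illustration of tiled large-area acquisition, the sketch below computes a serpentine grid of stage positions for covering a 10 cm × 10 cm specimen with overlapping OCT volumes. The per-volume field of view, overlap fraction, and the `tile_positions` helper are assumptions for illustration, not details from the abstract.

```python
# Illustrative scan-planning sketch: serpentine raster of stage positions whose
# volumes overlap for later stitching. FOV and overlap values are assumed.
import math

def tile_positions(specimen_mm=(100.0, 100.0), fov_mm=(10.0, 10.0), overlap=0.1):
    """Return (x, y) stage positions in mm covering the specimen in a serpentine raster."""
    step_x = fov_mm[0] * (1.0 - overlap)
    step_y = fov_mm[1] * (1.0 - overlap)
    nx = math.ceil((specimen_mm[0] - fov_mm[0]) / step_x) + 1
    ny = math.ceil((specimen_mm[1] - fov_mm[1]) / step_y) + 1
    positions = []
    for j in range(ny):
        xs = range(nx) if j % 2 == 0 else range(nx - 1, -1, -1)  # serpentine order
        for i in xs:
            positions.append((i * step_x, j * step_y))
    return positions

# Example: 10 cm x 10 cm specimen, 1 cm x 1 cm volumes, 10% overlap -> 11 x 11 tiles.
print(len(tile_positions()))  # 121
```

Serpentine ordering minimizes stage travel between consecutive volumes, and the overlap margin gives the stitching step common features to register adjacent tiles.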
The purpose of this study was to develop and evaluate the performance of a convolutional neural network (CNN) that uses a novel A-line-based classification approach to detect cancer in OCT images of breast specimens. Deep learning algorithms have been developed for OCT ophthalmology applications using pixel-based classification approaches. In this study, a novel deep learning approach was developed that classifies OCT A-lines of breast tissue. De-identified human breast tissues from mastectomy and breast reduction specimens were excised from patients at Columbia University Medical Center. A total of 82 specimens from 49 patients were imaged with OCT, including both normal and neoplastic tissues. The proposed algorithm used a hybrid 2D/1D convolutional neural network (CNN) to map each B-scan to a 1D label vector derived from manual annotation. Each A-line was labeled as one of the following tissue types: ductal carcinoma in situ (DCIS), invasive ductal carcinoma (IDC), adipose, or stroma. Five-fold cross-validation Dice scores across tissue types were 0.82-0.95 for IDC, 0.54-0.75 for DCIS, 0.67-0.91 for adipose, and 0.61-0.86 for stroma. In a second experiment, IDC and DCIS were combined into a single tissue class (malignancy), while stroma and adipose were combined into a second class (non-malignancy); this setup yielded five-fold cross-validation Dice scores between 0.89 and 0.93. Future work includes acquiring more patient samples and comparing the algorithm to previous work, including both deep learning and traditional automated image processing methods for classification of breast tissue in OCT images.
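The abstract does not specify the hybrid 2D/1D architecture; the sketch below shows one plausible realization in PyTorch that maps a single B-scan to per-A-line class logits over the four labeled tissue types. All layer sizes are chosen for illustration only.

```python
# Illustrative hybrid 2D/1D CNN: 2D convolutions over the B-scan, collapse of the
# axial (depth) dimension, then 1D convolutions producing one label per A-line.
import torch
import torch.nn as nn

class HybridALineCNN(nn.Module):
    def __init__(self, num_classes=4):
        super().__init__()
        # 2D stage: learn local texture features over the full B-scan.
        self.encoder2d = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d((2, 1)),                       # downsample depth only, keep A-line count
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d((2, 1)),
        )
        # 1D stage: after collapsing depth, classify each A-line from its lateral neighborhood.
        self.decoder1d = nn.Sequential(
            nn.Conv1d(32, 64, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv1d(64, num_classes, kernel_size=1),
        )

    def forward(self, bscan):                 # bscan: (batch, 1, depth, n_alines)
        feats = self.encoder2d(bscan)         # (batch, 32, depth/4, n_alines)
        feats = feats.mean(dim=2)             # collapse the axial dimension -> (batch, 32, n_alines)
        return self.decoder1d(feats)          # per-A-line logits: (batch, num_classes, n_alines)

# Example: a 512 (depth) x 1000 (A-line) B-scan yields 1000 per-A-line predictions.
logits = HybridALineCNN()(torch.randn(1, 1, 512, 1000))
print(logits.shape)  # torch.Size([1, 4, 1000])
```

Collapsing the axial dimension after the 2D stage preserves the lateral (A-line) axis, so the 1D stage can assign a tissue label to every A-line in the B-scan, matching the manual per-A-line annotations described above.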