High-dose-rate brachytherapy is a typical part of the treatment process for cervical cancer. During this procedure, radioactive sources are placed adjacent to the malignancy using specialized applicators or interstitial needles. To ensure accurate dose delivery and positive patient outcomes, medical imaging is used intra-procedurally to verify precise placement of the applicator. Previously, the fusion of three-dimensional ultrasound images has been investigated as an alternative volumetric imaging technique during cervical brachytherapy treatments. However, the need to manually register the two three-dimensional ultrasound images offline resulted in excessively large registration errors. To overcome this limitation, we have designed and developed a tracked, automated mechatronic system that inherently registers three-dimensional ultrasound images in real time. We performed a system calibration using an external coordinate system transform and validated the system tracking using a commercial optical tracker. Both experiments indicated sub-millimeter system accuracy, demonstrating strong device performance. Future work includes phantom validation experiments and translation of the device into clinical use.
KEYWORDS: Visualization, Open source software, 3D applications, Statistical analysis, Analytical research, Tissues, Data analysis, Ions, Biological research, Principal component analysis
Mass Spectrometry Imaging (MSI) is a powerful tool capable of visualizing molecular patterns to identify disease markers in tissue analysis. However, data analysis is computationally intensive and time-consuming, as no single platform can perform the entire preprocessing, visualization, and analysis pipeline end-to-end. The use of multiple software tools, each with its own required file formats, also makes the process error-prone. The purpose of this work is to develop a free, open-source software implementation called “Visualization, Preprocessing, and Registration Environment” (ViPRE), capable of end-to-end analysis of MSI data. ViPRE provides the functionalities required for MSI analysis, including data import, data visualization, data registration, Region of Interest (ROI) selection, spectral data alignment, and data analysis. The software is offered as an open-source module in 3D Slicer, a medical imaging platform, and is designed for flexibility and usability throughout the user experience. ViPRE was tested using sample MSI data to evaluate the computational pipeline, with results showing successful implementation of its functionalities and end-to-end usage. A preliminary usability test assessing user experience also showed positive results. ViPRE aims to satisfy the need for a one-stop, comprehensive interface for MSI data analysis. The source code and documentation will be made publicly available.
Treatment for Basal Cell Carcinoma (BCC) includes excisional surgery to remove cancerous tissue, using a cautery tool to make burns along a defined resection margin around the tumor. Margin evaluation occurs post-surgically, requiring repeat surgery if positive margins are detected. Rapid Evaporative Ionization Mass Spectrometry (REIMS) can help distinguish healthy from cancerous tissue but does not provide spatial information about the cautery tool location where the spectra are acquired. We propose using intraoperative surgical video recordings and deep learning to help surgeons locate sites of potential positive margins. Frames from 14 intraoperative videos of BCC surgery were extracted and used to train a sequence of networks. The first network extracts frames showing surgery in progress; then an object detection network localizes the cautery tool and resection margin. Finally, our burn prediction model leverages both a Long Short-Term Memory (LSTM) network and a Receiver Operating Characteristic (ROC)-based threshold to predict when the surgeon is cutting. In future work, the cut identifications will be synchronized with iKnife data to provide localizations when cuts are predicted. The model was trained with four-fold cross-validation on a patient-wise split between training, validation, and testing sets. Average recall over the four testing folds was 0.80 for the LSTM and 0.73 for the ROC approach. The video-based approach is simple yet effective at identifying tool-to-skin contact and may help guide surgeons, enabling them to deliver precise treatments in combination with iKnife data.
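The ROC component of the burn-prediction step amounts to choosing an operating threshold on a classifier's output scores. As an illustrative sketch only (not the authors' implementation), one common choice is the threshold that maximizes Youden's J statistic (sensitivity + specificity − 1) across the ROC sweep:

```python
import numpy as np

def youden_threshold(scores, labels):
    """Select the score threshold maximizing Youden's J statistic
    (sensitivity + specificity - 1) over an ROC sweep of candidate
    thresholds. scores: classifier outputs; labels: 0/1 ground truth."""
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels, dtype=int)
    best_t, best_j = None, -np.inf
    for t in np.unique(scores):
        pred = scores >= t
        tp = np.sum(pred & (labels == 1))
        fn = np.sum(~pred & (labels == 1))
        tn = np.sum(~pred & (labels == 0))
        fp = np.sum(pred & (labels == 0))
        sens = tp / (tp + fn)
        spec = tn / (tn + fp)
        j = sens + spec - 1
        if j > best_j:
            best_j, best_t = j, t
    return best_t, best_j
```

Frames with scores at or above the chosen threshold would then be labelled as cutting instances.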
High-dose-rate brachytherapy is an accepted standard-of-care treatment for prostate cancer. In this procedure, catheters are inserted using three-dimensional (3D) transrectal ultrasound image guidance, and their positions are manually segmented for treatment planning and delivery. The transverse ultrasound sweep, a commonly used option for image acquisition, is subject to tip and depth errors in catheter localization. We propose a two-step pipeline that uses a deep-learning network and curve fitting to automatically localize and model catheters in transversely reconstructed 3D ultrasound images. In the first step, a 3D U-Net was trained to automatically segment all catheters in a 3D ultrasound image. Curve fitting was then applied to model the shape of each individual catheter using polynomial fitting. Of the 343 catheters (from 20 patients) in the testing data, the pipeline detected 320 (93.29%) with 7 false positives (2.04%) and 13 false negatives (3.79%). The average distance ± one standard deviation between the ground truth and predictions for each catheter shaft was 1.9 ± 0.3 mm; the average tip difference was 3.0 ± 0.4 mm. The proposed pipeline provides a method for reducing time spent verifying catheter positions, minimizing uncertainties, and improving clinical workflow during the procedure. Reducing human variability in catheter placement predictions may increase the accuracy of tracking and radiation dose modelling.
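The curve-fitting step can be illustrated with a minimal sketch: given the voxel coordinates of one segmented catheter, fit low-order polynomials x(z) and y(z) along the insertion axis and read the modelled tip off the deepest fitted point. The function below is a hypothetical illustration, not the authors' code; the polynomial order and the tip-at-maximum-depth assumption are simplifications:

```python
import numpy as np

def fit_catheter(points, order=2):
    """Fit a catheter shaft as x(z), y(z) polynomials over segmented
    voxel coordinates (N x 3 array of x, y, z). Returns the coefficient
    arrays and the modelled tip, taken as the deepest point (max z)
    along the fitted curve."""
    points = np.asarray(points, dtype=float)
    z = points[:, 2]
    cx = np.polyfit(z, points[:, 0], order)  # x as a polynomial in z
    cy = np.polyfit(z, points[:, 1], order)  # y as a polynomial in z
    z_tip = z.max()  # simplification: tip assumed at maximum depth
    tip = np.array([np.polyval(cx, z_tip), np.polyval(cy, z_tip), z_tip])
    return cx, cy, tip
```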
High dose rate brachytherapy is a common procedure used in the treatment of gynecological cancers to irradiate malignant tumors while sparing the surrounding healthy tissue. While treatment may be delivered using a variety of applicator types, a hybrid technique consisting of an intracavitary applicator and interstitial needles allows for highly localized placement of the radioactive sources. To ensure an accurate and precise procedure, identification of the applicator and needle tips is necessary. The use of three-dimensional (3D) transrectal ultrasound (TRUS) and transabdominal ultrasound (TAUS) imaging has been previously investigated for the visualization of the intracavitary applicators. However, due to image artifacts from the applicator, needle tip identification is severely restricted when using a single 3D US view. To overcome this limitation and improve treatment outcome, we propose the use of image fusion to combine TRUS and TAUS images for the complete visualization of the applicator, needle tips, and surrounding anatomy. In this proof-of-concept work, we use a multimodality anthropomorphic pelvic phantom to assess the feasibility of image fusion and needle visualization using a hybrid brachytherapy applicator. We found that fused 3D US images resulted in accurate visualization of the pertinent structures when compared with magnetic resonance images. The results of this study demonstrate the future potential of image fusion in gynecological brachytherapy applications to ensure high treatment quality and reduce radiation dose to surrounding healthy tissue. This work is currently being expanded to other applicator types and is being applied to patients in a clinical trial.
Surgical excision for basal cell carcinoma (BCC) is a common treatment to remove the affected areas of skin. Minimizing positive margins around excised tissue is essential for successful treatment. Residual cancer cells may result in repeat surgery; however, detecting remaining cancer can be challenging and time-consuming. Using chemical signal data acquired while tissue is excised with a cautery tool, the iKnife system can discriminate between healthy and cancerous tissue but lacks spatial information, making it difficult to navigate back to suspicious margins. Intraoperative videos of BCC excision allow cautery locations to be tracked, providing the sites of potential positive margins. We propose a deep learning approach using convolutional neural networks to recognize phases in the videos and subsequently track the cautery location, comparing two localization methods (supervised and semi-supervised). Phase recognition was used for preprocessing to classify frames as showing the surgery or the start/stop of iKnife data acquisition. Only frames designated as showing the surgery were used for cautery localization. Fourteen videos were recorded during BCC excisions with iKnife data collection. On unseen testing data (2 videos, 1,832 frames), the phase recognition model showed an overall accuracy of 86%. Tool localization performed with a mean average precision of 0.98 and 0.96 for supervised and semi-supervised methods, respectively, at a 0.5 intersection over union threshold. Incorporating intraoperative phase data with tool tracking provides surgeons with spatial information about the cautery tool location around suspicious regions, potentially improving the surgeon's ability to navigate back to the area of concern.
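Mean average precision at a 0.5 intersection-over-union (IoU) threshold counts a detection as correct when its predicted box overlaps the ground-truth box with IoU ≥ 0.5. A minimal sketch of the IoU computation for axis-aligned boxes (generic evaluation code, not taken from the study):

```python
def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    # Corners of the intersection rectangle
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0
```

A detection with `iou(pred, truth) >= 0.5` would count as a true positive under the reported threshold.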
Brachytherapy is often used in gynecologic cancer treatment to provide high radiation doses to tumors and spare nearby healthy tissues. Intracavitary applicators, including tandem-and-ovoid and tandem-and-ring, are commonly used to position the radioactive sources appropriately. Three-dimensional (3D) transrectal ultrasound (TRUS) imaging has been demonstrated to allow for consistent delineation of the clinical target volume; however, the ability to visualize applicators and relevant structures following applicator insertion has not been investigated. We propose the use of a 3D TRUS system to visualize applicators at the time of placement. In two patient images, the key components of the tandem-and-ovoid applicators were clearly visualized, as well as the uterus, cervix, and vagina, with the potential to identify the tumor and organs-at-risk in these images. Although the tandem-and-ring applicator (one patient) obscured the cervix and anterior anatomy, the posterior applicator edges were visualized and we propose combining the 3D TRUS image with a 3D transabdominal ultrasound (TAUS) image for more complete visualization of the necessary structures. We designed a multimodality application-specific pelvic phantom to assess the feasibility of the image fusion and performed preliminary feasibility assessment on a tandem-and-ovoids applicator and a tandem-and-ring applicator with an interstitial ring cap. The resulting phantom images showed promising features for future image fusion. Intraoperative assessment of applicator placement has the potential to improve treatment quality and reduce the risk of complications from overexposure of nearby normal tissues, as well as providing a promising approach for accessible image-guided brachytherapy, facilitating broader adoption in cost-constrained healthcare settings.
High-dose-rate interstitial gynecologic brachytherapy requires multiple needles to be inserted into the tumor and surrounding area, avoiding nearby healthy organs-at-risk (OARs), including the bladder and rectum. We propose the use of a 360° three-dimensional (3D) transvaginal ultrasound (TVUS) guidance system for visualization of needles and report on the implementation of two automatic needle segmentation algorithms to aid the localization of needles intraoperatively. Two-dimensional (2D) needle segmentation, allowing for immediate adjustments to needle trajectories to mitigate needle deflection and avoid OARs, was implemented in near real-time using a method based on a convolutional neural network with a U-Net architecture trained on a dataset of 2D ultrasound images from multiple applications with needle-like structures. In 18 unseen TVUS images, the median position difference [95% confidence interval] was 0.27 [0.20, 0.68] mm and mean angular difference was 0.50 [0.27, 1.16]° between manually and algorithmically segmented needles. Automatic needle segmentation was performed in 3D TVUS images using an algorithm leveraging the randomized 3D Hough transform. All needles were accurately localized in a proof-of-concept image with a median position difference of 0.79 [0.62, 0.93] mm and median angular difference of 0.46 [0.31, 0.62]°, when compared to manual segmentations. Further investigation into the robustness of the algorithm to complex cases containing large shadowing, air, or reverberation artefacts is ongoing. Intraoperative automatic needle segmentation in interstitial gynecologic brachytherapy has the potential to improve implant quality and provides the potential for 3D ultrasound to be used for treatment planning, eliminating the requirement for post-insertion CT scans.
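The reported position and angular differences compare manually and automatically segmented needles. A simple sketch of such a comparison, assuming each needle is reduced to two points on its axis (the exact positional-difference definition used in the study may differ; midpoint distance is one straightforward choice):

```python
import numpy as np

def needle_difference(p0_a, p1_a, p0_b, p1_b):
    """Compare two needle segmentations, each given by two 3-D points
    on the needle axis. Returns the angular difference in degrees
    (direction-insensitive) and the distance between the segment
    midpoints in the same units as the inputs."""
    a = np.asarray(p1_a, float) - np.asarray(p0_a, float)
    b = np.asarray(p1_b, float) - np.asarray(p0_b, float)
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    angle = np.degrees(np.arccos(np.clip(abs(cos), -1.0, 1.0)))
    mid_a = (np.asarray(p0_a, float) + np.asarray(p1_a, float)) / 2
    mid_b = (np.asarray(p0_b, float) + np.asarray(p1_b, float)) / 2
    return angle, np.linalg.norm(mid_a - mid_b)
```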
KEYWORDS: 3D image processing, 3D acquisition, Computed tomography, Visualization, Tumors, Ultrasonography, Image visualization, 3D image reconstruction, High dynamic range imaging, Imaging systems
Brachytherapy, a type of radiotherapy, may be used to place radioactive sources into or in close proximity to tumors, providing a method for conformally escalating dose in the tumor and the local area surrounding the malignancy. High-dose-rate interstitial brachytherapy of vaginal tumors requires precise placement of multiple needles through holes in a plastic perineal template to deliver treatment while optimizing dose and avoiding overexposure of nearby organs at risk (OARs). Despite the importance of needle placement, image guidance for adaptive, intraoperative needle visualization, allowing misdirected needles to be identified and corrected during insertion, is not standard practice. We have developed a 360-deg three-dimensional (3-D) transvaginal ultrasound (TVUS) system using a conventional probe with a template-compatible custom sonolucent vaginal cylinder and propose its use for intraoperative needle guidance during interstitial gynecologic brachytherapy. We describe the 3-D TVUS mechanism and geometric validation, present mock phantom procedure results, and report on needle localization accuracy in patients. For the six patients imaged, landmark anatomical features and all needles were clearly visible. The implementation of 360-deg 3-D TVUS through a sonolucent vaginal cylinder provides a technique for visualizing needles and OARs intraoperatively during interstitial gynecologic brachytherapy, enabling implants to be assessed and providing the potential for image guidance.
During high-dose-rate (HDR) interstitial brachytherapy of gynecologic malignancies, precise placement of multiple needles is necessary to provide optimal dose to the tumor while avoiding overexposing nearby healthy organs, such as the bladder and rectum. Needles are currently placed based on preoperative imaging and clinical examination but there is currently no standard for intraoperative image guidance. We propose the use of a three-dimensional (3D) ultrasound (US) system incorporating three scanning geometries: 3D transrectal US (TRUS), 360° 3D sidefire transvaginal US (TVUS), and 3D endfire TVUS, to provide an accessible and versatile tool for intraoperative image guidance during interstitial gynecologic brachytherapy. Images are generated in 12–20 s by rotating a conventional two-dimensional US probe, providing a reconstructed 3D image immediately following acquisition. Studies of needles in patient images show mean differences in needle positions of 3.82 ± 1.86 mm and 2.36 ± 0.97 mm in TRUS and sidefire TVUS, respectively, when compared to the clinical x-ray computed tomography (CT) images. A proof-of-concept phantom study of the endfire TVUS mode demonstrated a mean positional difference of 1.91 ± 0.24 mm. Additionally, an automatic needle segmentation tool was tested on a 360° 3D TVUS patient image, resulting in a mean angular difference of 0.44 ± 0.19° and mean positional difference of 0.78 ± 0.17 mm when compared to manually segmented needles. The implementation of 3D US image guidance during HDR interstitial gynecologic brachytherapy provides a versatile intraoperative system with the potential for improved implant quality and reduced risk to nearby organs.
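Reconstructing a 3D image from a rotated 2D probe amounts to mapping each pixel of each frame, acquired at a known rotation angle, into Cartesian coordinates. A simplified sketch of the side-fire geometry (ignoring the probe-face offset and the voxel-grid interpolation a real scan converter performs):

```python
import numpy as np

def sidefire_to_cartesian(r, d, theta_deg):
    """Map a pixel in a 2-D side-fire frame to 3-D Cartesian coordinates.
    r: radial distance from the probe's rotation axis, d: position along
    the probe axis, theta_deg: rotation angle of the frame about that
    axis. A simplified geometry sketch, not the system's actual
    reconstruction code."""
    theta = np.radians(theta_deg)
    return np.array([r * np.cos(theta), r * np.sin(theta), d])
```

Sweeping theta over 360° while stacking frames fills the cylindrical volume around the probe.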
In high-dose-rate (HDR) interstitial gynecologic brachytherapy, needles are positioned into the tumor and surrounding area through a template to deliver radiotherapy. Optimal dose and avoidance of nearby organs requires precise needle placement; however, there is currently no standard method for intra-operative needle visualization or guidance. We have developed and validated a 360° three-dimensional (3D) transvaginal ultrasound (TVUS) system and created a sonolucent vaginal cylinder that is compatible with the current template to accommodate a conventional side-fire ultrasound probe. This probe is rotated inside the hollow sonolucent cylinder to generate a 3D image. We propose the use of this device for intra-operative verification of brachytherapy needle locations. In a feasibility study, the first ever 360° 3D TVUS image of a gynecologic brachytherapy patient was acquired and the image allowed key features, including bladder, rectum, vaginal wall, and bowel, to be visualized with needles clearly identifiable. Three patients were then imaged following needle insertion (28 needles total) and positions of the needles in the 3D TVUS image were compared to the clinical x-ray computed tomography (CT) image, yielding a mean trajectory difference of 1.67 ± 0.75°. The first and last visible points on each needle were selected in each modality and compared; the point pair with the larger distance was selected as the maximum difference in needle position with a mean maximum difference of 2.33 ± 0.78 mm. This study demonstrates that 360° 3D TVUS may be a feasible approach for intra-operative needle localization during HDR interstitial brachytherapy of gynecologic malignancies.
KEYWORDS: Breast, 3D image processing, Ultrasonography, System integration, Imaging systems, 3D scanning, Radiotherapy, Breast cancer, Image guided radiation therapy
Permanent breast seed implantation (PBSI) is a single-visit accelerated partial-breast irradiation method that uses needles inserted via a template to distribute Pd-103 radioactive seeds with two-dimensional (2D) ultrasound (US) guidance. This guidance approach is limited by its dependence on the operator and average seed placement errors greater than benchmark values established by dosimetric studies. We propose the use of a three-dimensional (3D) US imaging approach for needle guidance with integrated template tracking. We previously described the preliminary development and validation of the 3D US mechatronic system. The present work demonstrates the accuracy of the integrated system by quantifying agreement between tracking and imaging sub-systems and its use guiding a phantom procedure. Tracking error was measured by inserting a needle a known distance through the template and comparing the expected tip position from tracking to the observed tip position from imaging. Mean ± standard deviation differences in needle tip position and angle were 2.90 ± 0.76 mm and 1.77 ± 0.98°, respectively, validating the needle tracking accuracy of the developed system. The system was used to guide 15 needles into a patient-specific phantom according to the accompanying treatment plan, and micro-CT images were taken before and after to evaluate placement accuracy. Seed positions were modelled using needle positions and the resulting dosimetry was compared to a procedure-specific benchmark. The mean tip difference was 2.08 mm while the mean angular difference was 2.6°, resulting in acceptable dosimetric coverage. These results demonstrate 3D US as a potentially feasible technique for PBSI guidance.
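The tracking-error measurement compares an expected tip position, predicted from the tracked template and the known insertion depth, with the tip observed in the image. A minimal sketch of that comparison, assuming a straight needle path (an illustration, not the system's implementation):

```python
import numpy as np

def expected_tip(hole_xyz, direction, depth):
    """Predict a needle tip position from template tracking: start at
    the tracked template-hole position and advance a known insertion
    depth along the (assumed straight) needle direction."""
    u = np.asarray(direction, float)
    u = u / np.linalg.norm(u)  # unit insertion direction
    return np.asarray(hole_xyz, float) + depth * u

def tip_error(expected, observed):
    """Euclidean distance between the expected (tracked) tip and the
    observed (imaged) tip."""
    return float(np.linalg.norm(np.asarray(expected, float)
                                - np.asarray(observed, float)))
```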
KEYWORDS: Ultrasonography, 3D image processing, Visualization, Distance measurement, Cancer, Oncology, 3D acquisition, Magnetic resonance imaging, Imaging systems, 3D scanning
Treatment for gynaecological cancers often includes brachytherapy; in particular, in high-dose-rate (HDR) interstitial brachytherapy, hollow needles are inserted into the tumour and surrounding area through a template in order to deliver the radiation dose. Currently, there is no standard modality for visualizing needles intra-operatively, despite the need for precise needle placement in order to deliver the optimal dose and avoid nearby organs, including the bladder and rectum. While three-dimensional (3D) transrectal ultrasound (TRUS) imaging has been proposed for 3D intra-operative needle guidance, anterior needles tend to be obscured by shadowing created by the template’s vaginal cylinder. We have developed a 360-degree 3D transvaginal ultrasound (TVUS) system that uses a conventional two-dimensional side-fire TRUS probe rotated inside a hollow vaginal cylinder made from a sonolucent plastic (TPX). The system was validated using grid and sphere phantoms in order to test the geometric accuracy of the distance and volumetric measurements in the reconstructed image. To test the potential for visualizing needles, an agar phantom mimicking the geometry of the female pelvis was used. Needles were inserted into the phantom and then imaged using the 3D TVUS system. The needle trajectories and tip positions in the 3D TVUS scan were compared to their expected values and the needle tracks visualized in magnetic resonance images. Based on this initial study, 360-degree 3D TVUS imaging through a sonolucent vaginal cylinder is a feasible technique for intra-operatively visualizing needles during HDR interstitial gynaecological brachytherapy.
KEYWORDS: Ultrasonography, Cancer, Visualization, Real time imaging, Rectum, 3D image processing, Computed tomography, Transducers, 3D metrology, 3D scanning
High-dose-rate (HDR) interstitial brachytherapy is often included in standard-of-care for gynaecological cancers. Needles are currently inserted through a perineal template without any standard real-time imaging modality to assist needle guidance, causing physicians to rely on pre-operative imaging, clinical examination, and experience. While two-dimensional (2D) ultrasound (US) is sometimes used for real-time guidance, visualization of needle placement and depth is difficult and subject to variability and inaccuracy in 2D images. The close proximity to critical organs, in particular the rectum and bladder, can lead to serious complications. We have developed a three-dimensional (3D) transrectal US system and are investigating its use for intra-operative visualization of needle positions used in HDR gynaecological brachytherapy. As a proof-of-concept, four patients were imaged with post-insertion 3D US and x-ray CT. Using software developed in our laboratory, manual rigid registration of the two modalities was performed based on the perineal template’s vaginal cylinder. The needle tip and a second point along the needle path were identified for each needle visible in US. The difference between modalities in the needle trajectory and needle tip position was calculated for each identified needle. For the 60 needles placed, the mean trajectory difference was 3.23 ± 1.65° across the 53 visible needle paths and the mean difference in needle tip position was 3.89 ± 1.92 mm across the 48 visible needle tips. Based on the preliminary results, 3D transrectal US shows potential for the development of a 3D US-based needle guidance system for interstitial gynaecological brachytherapy.
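The rigid registration step aligns the two modalities with a rotation and translation. One standard way to compute such a transform from corresponding landmark points is the SVD-based Kabsch method; the sketch below is a generic implementation, not necessarily the method used in the authors' software:

```python
import numpy as np

def rigid_register(src, dst):
    """Least-squares rigid transform (rotation R, translation t) that
    maps src points onto dst points, computed with the SVD-based
    Kabsch method. src, dst: N x 3 arrays of corresponding landmarks."""
    src = np.asarray(src, float)
    dst = np.asarray(dst, float)
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    # Cross-covariance of the centred point sets
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:  # correct an improper rotation (reflection)
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = c_dst - R @ c_src
    return R, t
```

Applying `R @ p + t` to points identified in one modality maps them into the other modality's frame, after which trajectory and tip differences can be measured.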