Significance: Holographic display technology is a promising area of research that can lead to significant advancements in cancer surgery. We present the benefits of combining bioinspired multispectral imaging technology with holographic goggles for fluorescence-guided cancer surgery. Through a series of experiments with 3D-printed phantoms, small animal models of cancer, and surgeries on canine patients with head and neck cancer, we showcase the advantages of this holistic approach.
Aim: The aim of our study is to demonstrate the feasibility and potential benefits of using a holographic display for fluorescence-guided surgery through a series of experiments involving 3D-printed phantoms and canine patients with head and neck cancer.
Approach: We explore the integration of a bioinspired camera with a mixed reality headset to project fluorescent images as holograms onto a see-through display, and we demonstrate the potential benefits of this technology through benchtop and in vivo animal studies.
Results: Our complete imaging and holographic display system showed improved delineation of fluorescent targets in phantoms compared with a 2D monitor display and integrated easily into the veterinary surgical workflow.
Conclusions: Our findings indicate that our comprehensive approach, which combines a bioinspired multispectral imaging sensor with holographic goggles, holds promise for enhancing the presentation of fluorescence information to surgeons during intraoperative scenarios while minimizing workflow disruptions.
Near-infrared fluorescence image-guided surgery offers substantial benefits for both surgeons and patients, but high-performance near-infrared fluorescent molecular markers are not available for clinical application. To enable adoption of a wider variety of fluorescent markers, we have developed a single-chip snapshot multispectral imaging system that combines a scientific CMOS image sensor offering high quantum efficiency (94%), low read noise (1.6 e-), and high dynamic range (91 dB) with pixelated interference filters that enable color/near-infrared imaging without co-registration error. The fully integrated imaging system is currently being evaluated for tumor detection with targeted probes and sentinel lymph node detection with indocyanine green.
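For readers less familiar with these sensor figures of merit, the reported dynamic range follows from the ratio of full-well capacity to read noise. The short sketch below shows that standard relation; the full-well value used here is a hypothetical placeholder chosen only to be consistent with the quoted 1.6 e- read noise and 91 dB dynamic range, not a published specification of the sensor.

```python
import math

# Sensor dynamic range in decibels:
#   DR_dB = 20 * log10(full_well_capacity / read_noise)
# Read noise (1.6 e-) and dynamic range (91 dB) are quoted in the abstract;
# the full-well capacity below is an assumed value consistent with them.
read_noise_e = 1.6      # RMS read noise in electrons
full_well_e = 57_000    # assumed full-well capacity in electrons (illustrative)

dr_db = 20 * math.log10(full_well_e / read_noise_e)
print(f"Dynamic range: {dr_db:.1f} dB")  # ~91 dB
```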
Surgery is the primary curative option for patients with cancer, with the overall objective of complete resection of all cancerous tissue while avoiding iatrogenic damage to healthy tissue. Simultaneous imaging of weak fluorescence signals from multiple targeted molecular markers under bright surgical illumination remains an unmet goal with current intraoperative instruments. In this talk, I will describe our recent efforts to solve this intraoperative challenge by drawing inspiration from the visual system of the mantis shrimp, a compact biological system optimized for multispectral imaging. We have successfully designed, tested, and clinically translated our bioinspired imagers by monolithically integrating vertically stacked photodetectors with pixelated interference filters. The sensor is capable of recording color and NIR fluorescence from three different molecular markers and displaying this information using augmented reality goggles. The sensor has a resolution of 1280 by 720 pixels, operates at 30 frames per second, and has been used to simultaneously image the tumor-targeted dye IR800 and the nerve-targeted dye Oxazine-4. Displaying this information in the operating room is a challenging feat. We have used a variety of augmented reality displays and will provide an overview of both the preclinical and clinical translation of this technology.
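Recovering the contributions of two co-administered dyes from a filter-array sensor is commonly handled by linear spectral unmixing. The sketch below is a minimal illustration of that general technique under assumed conditions, not the authors' implementation; the mixing matrix values and the two-channel layout are placeholder assumptions that a real system would replace with calibrated single-dye measurements.

```python
import numpy as np

# Minimal linear spectral unmixing sketch (illustrative only).
# Each NIR pixel is modeled as a linear mix of two dye signals:
#   measurement = M @ [ir800, oxazine4]
# The mixing matrix below is a hypothetical placeholder.
M = np.array([[0.9, 0.2],    # channel 1 response to (IR800, Oxazine-4)
              [0.3, 0.8]])   # channel 2 response to (IR800, Oxazine-4)

def unmix(nir_channels: np.ndarray) -> np.ndarray:
    """nir_channels: (H, W, 2) raw NIR channels -> (H, W, 2) dye abundances."""
    h, w, _ = nir_channels.shape
    pixels = nir_channels.reshape(-1, 2).T               # shape (2, H*W)
    abundances, *_ = np.linalg.lstsq(M, pixels, rcond=None)
    return np.clip(abundances.T.reshape(h, w, 2), 0, None)

# Example on a synthetic 1280 x 720 two-channel NIR frame
frame = np.random.rand(720, 1280, 2).astype(np.float32)
dyes = unmix(frame)
print(dyes.shape)  # (720, 1280, 2): per-pixel IR800 and Oxazine-4 estimates
```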
Near-infrared fluorescence (NIRF) based image-guided surgery aims to provide vital information to the surgeon in the operating room, such as the locations of cancerous tissue that should be resected and healthy tissue that should be preserved. Targeted molecular markers, such as tumor- or nerve-specific probes, are used in conjunction with NIRF imaging and display systems to provide key information to the operator in real time. One of the major hurdles to the wide adoption of these imaging systems is the high cost of operating the instruments, their large footprint, and the complexity of operating them. The emergence of wearable NIRF systems has addressed these shortcomings by minimizing the footprint of the imaging and display systems and reducing operational cost. However, one of the major shortcomings of this technology is the replacement of the surgeon's natural vision with an augmented reality view of the operating room. In this paper, we address this shortcoming by exploiting hologram technology from Microsoft HoloLens to present NIR information as an overlay on the color scene perceived through the surgeon's natural vision. NIR information is captured with a CMOS sensor with high quantum efficiency at the 800 nm wavelength together with a laser illumination source. The NIR image is converted to a hologram that is displayed on the Microsoft HoloLens and is correctly co-registered with the operator's natural eyesight.
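As a rough illustration of how a captured NIR frame can be turned into a pseudocolored overlay for a see-through display, the sketch below thresholds the fluorescence signal and alpha-blends it over a color frame. The threshold, blend weight, and green pseudocolor are arbitrary illustrative choices, and the actual HoloLens rendering and co-registration pipeline is not reproduced here.

```python
import numpy as np

def nir_overlay(color: np.ndarray, nir: np.ndarray,
                threshold: float = 0.2, alpha: float = 0.6) -> np.ndarray:
    """Blend a pseudocolored NIR fluorescence map over a color frame.

    color: (H, W, 3) float RGB image in [0, 1]
    nir:   (H, W) float NIR intensity image in [0, 1]
    Threshold and green pseudocolor are arbitrary illustrative choices.
    """
    mask = nir > threshold                    # suppress background fluorescence
    pseudocolor = np.zeros_like(color)
    pseudocolor[..., 1] = nir                 # map fluorescence to green channel
    out = color.copy()
    out[mask] = (1 - alpha) * color[mask] + alpha * pseudocolor[mask]
    return np.clip(out, 0.0, 1.0)

# Example with synthetic frames
color_frame = np.random.rand(720, 1280, 3).astype(np.float32)
nir_frame = np.random.rand(720, 1280).astype(np.float32)
overlay = nir_overlay(color_frame, nir_frame)
print(overlay.shape)  # (720, 1280, 3)
```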