Surgical procedures often have the unfortunate side-effect of causing the patient significant trauma while accessing
the target site. Indeed, in some cases the trauma inflicted on the patient during access to the target greatly
exceeds that caused by performing the therapy itself. Heart disease has traditionally been treated surgically using
open-chest techniques, with the patient placed "on pump", i.e., with circulation maintained by a
cardiopulmonary bypass or "heart-lung" machine. Recently, techniques have been developed for performing
minimally invasive interventions on the heart, obviating the need for these highly invasive open-chest procedures. These new approaches
rely on pre-operative images, combined with real-time images acquired during the procedure. Our approach
is to register intra-operative images to the patient, and use a navigation system that combines intra-operative
ultrasound with virtual models of instrumentation that has been introduced into the chamber through the heart
wall. This paper illustrates the problems associated with traditional ultrasound guidance, and reviews the state
of the art in real-time 3D cardiac ultrasound technology. In addition, it discusses the implementation of an image-guided
intervention platform that integrates real-time ultrasound with a virtual reality environment, bringing
together the pre-operative anatomy derived from MRI or CT, representations of tracked instrumentation inside
the heart chamber, and the intra-operatively acquired ultrasound images.
Image-guided interventions rely on the common assumption that pre-operative information can depict intra-operative
morphology with sufficient accuracy. Nevertheless, in the context of minimally invasive cardiac therapy
delivery, this assumption loses ground: the heart is a soft-tissue organ whose position and morphology change
during access to the heart and, especially, to intracardiac targets. In addition to its clinical value for cardiac interventional guidance
and assistance with the image- and model-to-patient registration, here we show how ultrasound imaging may be
used to estimate changes in the heart position and morphology of structures of interest at different stages in the
procedure. Using a magnetically tracked 2D transesophageal echocardiography transducer, we acquired in vivo
images of the heart at different stages during the procedural workflow of common minimally invasive cardiac
procedures, including robot-assisted coronary artery bypass grafting, mitral valve replacement/repair, and model-enhanced
US-guided intracardiac interventions, all in the coordinate system of the tracking system. Anatomical
features of interest (mitral and aortic valves) used to register the pre-operative anatomical models to the intra-operative
coordinate frame were identified from each dataset. This information allowed us to identify the global
position of the heart and also characterize the valvular structures at various peri-operative stages, in terms of
their orientation, size, and geometry. Based on these results, we can estimate the differences between the pre- and
intra-operative anatomical features, their effect on the model-to-subject registration, and also identify the
need to update or optimize any pre-operative surgical plan to better suit the intra-operative procedure workflow.
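The landmark-based alignment described above, in which pre-operative anatomical models are registered to the intra-operative tracker coordinate frame via corresponding valve features, is typically solved as a point-based rigid registration. The following is a minimal sketch assuming paired 3D landmarks, using the standard SVD-based (Arun/Kabsch) solution rather than any specific implementation from this work; the function names and data are illustrative only.

```python
# Sketch: point-based rigid registration of pre-operative landmarks
# (e.g., points on the mitral and aortic valve annuli) to their
# intra-operative, tracker-space counterparts. Illustrative only.
import numpy as np

def rigid_register(pre_op, intra_op):
    """Return rotation R and translation t minimising ||R @ p + t - q||."""
    pre_op = np.asarray(pre_op, float)
    intra_op = np.asarray(intra_op, float)
    c_pre, c_intra = pre_op.mean(axis=0), intra_op.mean(axis=0)
    # Cross-covariance of the centred point sets.
    H = (pre_op - c_pre).T @ (intra_op - c_intra)
    U, _, Vt = np.linalg.svd(H)
    # Correct a possible reflection so R is a proper rotation.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = c_intra - R @ c_pre
    return R, t

def fre(pre_op, intra_op, R, t):
    """Fiducial registration error: RMS residual after alignment (mm)."""
    residuals = (R @ np.asarray(pre_op, float).T).T + t - np.asarray(intra_op, float)
    return np.sqrt((residuals ** 2).sum(axis=1).mean())
```

The fiducial registration error gives a first-order check of alignment quality; peri-operative changes in the valvular structures of the kind measured above would appear as elevated residuals, signalling that the pre-operative plan may need updating.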
In this paper, we present an open-source framework for testing tracking devices in surgical
navigation applications. At the core of image-guided intervention systems is the tracking interface
that handles communication with the tracking device and gathers tracking information. Given that
the correctness of tracking information is critical for protecting patient safety and for ensuring the
successful execution of an intervention, the tracking software component needs to be thoroughly
tested on a regular basis. Furthermore, with widespread use of extreme programming methodology
that emphasizes continuous and incremental testing of application components, testing design
becomes critical. While it is easy to automate most of the testing process, it is often more difficult to
test components that require manual intervention, such as a tracking device.
Our framework consists of a robotic arm built from a set of Lego Mindstorms and an open-source
toolkit written in C++ to control the robot movements and assess the accuracy of the tracking
devices. The application programming interface (API) is cross-platform and runs on Windows, Linux, and
macOS.
We applied this framework to the continuous testing of the Image-Guided Surgery Toolkit
(IGSTK), an open-source toolkit for image-guided surgery, and show that regression testing of
tracking devices can be performed at low cost and can significantly improve software quality.
Ultrasound is garnering significant interest as an imaging modality for surgical guidance, due to its affordability,
real-time temporal resolution and ease of integration into the operating room. Minimally-invasive intracardiac
surgery performed on the beating heart prevents direct vision of the surgical target, and procedures such as
mitral valve replacement and atrial septal defect closure would benefit from intraoperative ultrasound imaging.
We propose that placing 4D ultrasound within an augmented reality environment, along with a patient-specific
cardiac model and virtual representations of tracked surgical tools, will create a visually intuitive platform with
sufficient image information to safely and accurately repair tissue within the beating heart. However, the imaging
parameters, spatial calibration, temporal calibration, and ECG-gating must all be well characterized
before any 4D ultrasound system can be used clinically to guide the treatment of moving structures. In this paper,
we describe a comprehensive accuracy assessment framework that can be used to evaluate the performance of 4D
ultrasound systems while imaging moving targets. We image a dynamic phantom comprising a simple
robot and a tracked phantom to which point-source, distance, and spherical objects of known construction can be
attached. We also follow our protocol to evaluate 4D ultrasound images generated in real-time by reconstructing
ECG-gated 2D ultrasound images acquired from a tracked multiplanar transesophageal probe. More broadly, our
evaluation framework allows any type of 4D ultrasound system to be quantitatively assessed.
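The ECG-gated reconstruction described here assigns each tracked 2D frame to a cardiac-phase bin based on its timestamp relative to the surrounding R-waves, so that frames collected over many heartbeats can be assembled into a 3D + phase (4D) sequence. The following is a minimal sketch of that binning step only; the bin count is an illustrative choice and R-wave detection is assumed to be given.

```python
# Sketch: ECG gating of tracked 2D frames into cardiac-phase bins.
# r_waves is an ascending list of R-wave timestamps (seconds);
# frame_times are the acquisition timestamps of the 2D frames.
import bisect

def cardiac_phase(t, r_waves):
    """Fraction of the R-R interval elapsed at time t (0 <= phase < 1)."""
    i = bisect.bisect_right(r_waves, t) - 1
    if i < 0 or i + 1 >= len(r_waves):
        raise ValueError("timestamp outside gated R-wave range")
    return (t - r_waves[i]) / (r_waves[i + 1] - r_waves[i])

def gate_frames(frame_times, r_waves, n_bins=10):
    """Map each phase bin to the indices of the frames that fall in it."""
    bins = {b: [] for b in range(n_bins)}
    for idx, t in enumerate(frame_times):
        bins[int(cardiac_phase(t, r_waves) * n_bins)].append(idx)
    return bins
```

Errors in temporal calibration or gating shift frames into the wrong phase bin, which is precisely why the accuracy framework above images targets of known geometry and motion: mis-binned frames show up as blurring or displacement of the reconstructed point-source and spherical objects.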