Real-time fusion of endoscopic views with dynamic 3-D cardiac images: a phantom study
Abstract
Minimally invasive robotically assisted cardiac surgical systems currently do not routinely employ 3-D image guidance. However, preoperative magnetic resonance and computed tomography (CT) images have the potential to be used in this role if they are appropriately registered with the patient anatomy and animated synchronously with the motion of the actual heart. This paper discusses the fusion of optical images of a beating heart phantom, obtained from an optically tracked endoscope, with volumetric images of the phantom created from a dynamic CT dataset. High-quality preoperative dynamic CT images are created by first extracting the motion parameters of the heart from the series of temporal frames, and then applying this information to animate a high-quality heart image acquired at end systole. Temporal synchronization of the endoscopic images and the CT model is achieved by selecting the appropriate CT image from the dynamic set, based on an electrocardiographic trigger signal. The spatial error between the optical and virtual images is 1.4 ± 1.1 mm, while the time discrepancy is typically 50-100 ms.

Index Terms: Image guidance, image warping, minimally invasive cardiac surgery, virtual endoscopy, virtual reality.
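The ECG-triggered selection of a CT frame can be pictured with a short sketch. The code below is not from the paper: the CTFrame structure, the select_ct_frame function, and the assumption that cardiac phase is taken as the fraction of the current R-R interval elapsed since the last R-wave trigger are all illustrative choices for how such a lookup might be implemented.

```python
# Minimal sketch (illustrative, not the authors' implementation):
# pick the dynamic-CT frame whose cardiac phase best matches the live
# heart, given the time since the last ECG R-wave trigger.
from dataclasses import dataclass
from typing import Sequence


@dataclass
class CTFrame:
    phase: float      # cardiac phase in [0, 1), 0 = R-wave (trigger)
    volume_id: str    # handle to a pre-rendered dynamic CT volume


def select_ct_frame(frames: Sequence[CTFrame],
                    t_now: float,
                    t_last_trigger: float,
                    rr_interval: float) -> CTFrame:
    """Return the CT frame closest in cardiac phase to the live heart.

    t_now, t_last_trigger : timestamps in seconds
    rr_interval           : current R-R interval estimate in seconds
    """
    phase_now = ((t_now - t_last_trigger) / rr_interval) % 1.0

    def phase_dist(frame: CTFrame) -> float:
        # Wrap-around distance on the unit circle of cardiac phase.
        d = abs(frame.phase - phase_now)
        return min(d, 1.0 - d)

    return min(frames, key=phase_dist)


if __name__ == "__main__":
    # Ten frames spanning one cardiac cycle at phases 0.0, 0.1, ..., 0.9.
    frames = [CTFrame(phase=i / 10.0, volume_id=f"ct_{i:02d}") for i in range(10)]
    # 0.45 s after the trigger with a 0.8 s R-R interval gives phase ~0.56,
    # so the frame at phase 0.6 (ct_06) is selected.
    print(select_ct_frame(frames, t_now=10.45, t_last_trigger=10.0,
                          rr_interval=0.8).volume_id)
```

Under this scheme the temporal error is bounded by roughly half the spacing between successive CT phases plus any trigger-detection latency, which is consistent with the 50-100 ms discrepancy reported in the abstract.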