
Proceedings Paper

Soft tissue navigation for laparoscopic prostatectomy: evaluation of camera pose estimation for enhanced visualization
Author(s): M. Baumhauer; T. Simpfendörfer; R. Schwarz; M. Seitel; B. P. Müller-Stich; C. N. Gutt; J. Rassweiler; H.-P. Meinzer; I. Wolf

Paper Abstract

We introduce a novel navigation system to support minimally invasive prostate surgery. The system utilizes transrectal ultrasonography (TRUS) and needle-shaped navigation aids to visualize hidden structures via Augmented Reality. During the intervention, the navigation aids are segmented once from a 3D TRUS dataset and subsequently tracked by the endoscope camera. Camera pose estimation methods directly determine the position and orientation of the camera relative to the navigation aids. Accordingly, our system does not require any external tracking device for the registration of the endoscope camera and the ultrasonography probe. In addition to a preoperative planning step in which the navigation targets are defined, the procedure consists of two main steps that are carried out during the intervention: First, the preoperatively prepared planning data is registered with an intraoperatively acquired 3D TRUS dataset and the segmented navigation aids. Second, the navigation aids are continuously tracked by the endoscope camera. The camera's pose can thereby be derived, and relevant medical structures can be superimposed on the video image. This paper focuses on the latter step. We have implemented several promising real-time algorithms and incorporated them into the open-source toolkit MITK (www.mitk.org). Furthermore, we have evaluated them for minimally invasive surgery (MIS) navigation scenarios. For this purpose, a virtual evaluation environment has been developed, which allows for the simulation of navigation targets and navigation aids, including their measurement errors. Besides evaluating the accuracy of the computed pose, we have analyzed the impact of an inaccurate pose and the resulting displacement of navigation targets in Augmented Reality.
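At its core, the tracking step described in the abstract is a Perspective-n-Point (PnP) problem: given the known 3D positions of the segmented navigation aids and their 2D detections in the endoscope image, solve for the camera's rotation and translation, then project the planned targets into the video frame for the Augmented Reality overlay. The sketch below illustrates this idea with OpenCV's generic PnP solver; it is not the paper's implementation or its MITK integration, and all point coordinates, camera intrinsics, and the target position are illustrative assumptions.

// Minimal PnP sketch: estimate the endoscope camera pose from navigation-aid
// correspondences, then project a planned target for AR overlay.
// Uses OpenCV's generic solver as a stand-in for the paper's algorithms.
#include <opencv2/calib3d.hpp>
#include <opencv2/core.hpp>
#include <iostream>
#include <vector>

int main() {
    // 3D positions of the needle-shaped navigation aids in the TRUS
    // coordinate system (illustrative values, millimetres).
    std::vector<cv::Point3f> aidPositions = {
        {  0.f,  0.f,  0.f},
        { 20.f,  0.f,  5.f},
        {  0.f, 25.f, 10.f},
        { 15.f, 20.f,  0.f}
    };

    // Corresponding 2D detections of the aids in the endoscope image
    // (illustrative pixel coordinates).
    std::vector<cv::Point2f> imagePoints = {
        {320.f, 240.f},
        {400.f, 235.f},
        {330.f, 330.f},
        {390.f, 310.f}
    };

    // Intrinsic camera matrix from a prior endoscope calibration
    // (assumed focal lengths and principal point).
    cv::Mat K = (cv::Mat_<double>(3, 3) <<
        800.0,   0.0, 320.0,
          0.0, 800.0, 240.0,
          0.0,   0.0,   1.0);
    cv::Mat distCoeffs = cv::Mat::zeros(4, 1, CV_64F);  // assume undistorted image

    // Solve for the pose (rotation rvec, translation tvec) mapping
    // TRUS coordinates into the camera frame.
    cv::Mat rvec, tvec;
    bool ok = cv::solvePnP(aidPositions, imagePoints, K, distCoeffs,
                           rvec, tvec, false, cv::SOLVEPNP_ITERATIVE);
    if (!ok) {
        std::cerr << "Pose estimation failed\n";
        return 1;
    }

    // With the pose known, a preoperatively planned target point
    // (hypothetical coordinates) can be projected onto the video image.
    std::vector<cv::Point3f> target = {{10.f, 10.f, 40.f}};
    std::vector<cv::Point2f> projected;
    cv::projectPoints(target, rvec, tvec, K, distCoeffs, projected);

    std::cout << "Target projects to pixel ("
              << projected[0].x << ", " << projected[0].y << ")\n";
    return 0;
}

In the same spirit as the paper's evaluation, one could perturb aidPositions and imagePoints with simulated measurement noise and measure how far the projected target moves, which corresponds to the displacement of navigation targets in the Augmented Reality overlay analyzed by the authors.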

Paper Details

Date Published: 21 March 2007
PDF: 12 pages
Proc. SPIE 6509, Medical Imaging 2007: Visualization and Image-Guided Procedures, 650911 (21 March 2007); doi: 10.1117/12.709655
Author Affiliations:
M. Baumhauer, German Cancer Research Ctr. (Germany)
T. Simpfendörfer, Univ. of Heidelberg (Germany)
R. Schwarz, German Cancer Research Ctr. (Germany)
M. Seitel, German Cancer Research Ctr. (Germany)
B. P. Müller-Stich, Univ. of Heidelberg (Germany)
C. N. Gutt, Univ. of Heidelberg (Germany)
J. Rassweiler, Urological Clinic Heilbronn (Germany)
H.-P. Meinzer, German Cancer Research Ctr. (Germany)
I. Wolf, German Cancer Research Ctr. (Germany)


Published in SPIE Proceedings Vol. 6509:
Medical Imaging 2007: Visualization and Image-Guided Procedures
Editor(s): Kevin R. Cleary; Michael I. Miga

© SPIE.