
Proceedings Paper

A Windows GUI application for real-time image guidance during motion-managed proton beam therapy
Author(s): Zheng Zhang; Chris Beltran; Stephen M. Corner; Amanda J. Deisher; Michael G. Herman; Jon J. Kruse; Hok Seum Wan Chan Tseung; Erik J. Tryggestad

Paper Abstract

Respiratory motion management is crucial for maintaining the robustness of the delivered dose in radiotherapy. This is particularly relevant for spot-scanned proton therapy, where motion-induced dosimetric “interplay” effects can severely perturb the planned dose distribution. Our proton therapy vendor developed a stereoscopic kV x-ray image guidance platform along with a 3D/2D image matching algorithm for 6-degree-of-freedom (6DOF) patient positioning with a robotic couch. However, this vendor-provided solution lacks the capability to adequately handle real-time kV fluoroscopy, which is crucial for aspects of motion management. To address this clinical gap, we augmented the vendor’s system with a custom signal-processing pathway that passively listens to the flat-panel detector (FPD) data stream (Camera Link) and handles fluoroscopic frames independently in real time. Additionally, we built a novel calibration phantom and an accompanying room-geometry-specific calibration routine for projective overlay of DICOM-RT structures onto the 2D FPD frames. Because our calibration routine was developed independently, it may also serve as an independent means of testing and validating the vendor’s imaging-geometry calibration. We developed a Windows-based application in .NET/C# to drive all data acquisition and processing. Through DICOM integration with our treatment-planning infrastructure, the tool automatically archives clinical x-ray data to a HIPAA-compliant cloud and therefore also serves as a data interface for retrieving previously recorded x-ray images and cine video streams, providing a platform for future image-guidance research. The next goal on our roadmap is to develop deep-learning methods for real-time soft-tissue-based tumor tracking.
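
To illustrate the projective overlay step described in the abstract, the following is a minimal C# sketch. It assumes the room-geometry-specific calibration yields a 3x4 homogeneous projection matrix mapping room coordinates (mm) to FPD pixel coordinates; the type and method names are hypothetical and do not reflect the paper's actual implementation.

    using System.Collections.Generic;

    // Hypothetical sketch: map DICOM-RT contour vertices (room coordinates, mm)
    // onto a 2D flat-panel detector frame via a calibrated projection matrix.
    public readonly struct Point3
    {
        public readonly double X, Y, Z;
        public Point3(double x, double y, double z) { X = x; Y = y; Z = z; }
    }

    public readonly struct Point2
    {
        public readonly double U, V;
        public Point2(double u, double v) { U = u; V = v; }
    }

    public static class StructureOverlay
    {
        // Project one 3D point to detector pixel coordinates using a 3x4
        // homogeneous projection matrix P obtained from the calibration routine.
        public static Point2 Project(double[,] P, Point3 p)
        {
            double u = P[0, 0] * p.X + P[0, 1] * p.Y + P[0, 2] * p.Z + P[0, 3];
            double v = P[1, 0] * p.X + P[1, 1] * p.Y + P[1, 2] * p.Z + P[1, 3];
            double w = P[2, 0] * p.X + P[2, 1] * p.Y + P[2, 2] * p.Z + P[2, 3];
            return new Point2(u / w, v / w); // perspective divide yields pixel coordinates
        }

        // Project every vertex of a DICOM-RT contour so it can be drawn on the FPD frame.
        public static List<Point2> ProjectContour(double[,] P, IEnumerable<Point3> contour)
        {
            var pixels = new List<Point2>();
            foreach (var p in contour)
                pixels.Add(Project(P, p));
            return pixels;
        }
    }

In a sketch like this, the projected contour points would be drawn over each incoming fluoroscopic frame; only the projection matrix is room-specific, so the same overlay code can serve every gantry/detector geometry covered by the calibration.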

Paper Details

Date Published: 16 March 2020
PDF: 13 pages
Proc. SPIE 11315, Medical Imaging 2020: Image-Guided Procedures, Robotic Interventions, and Modeling, 113152F (16 March 2020); doi: 10.1117/12.2549748
Author Affiliations:
Zheng Zhang, Mayo Clinic Rochester (United States)
Chris Beltran, Mayo Clinic Rochester (United States)
Stephen M. Corner, Mayo Clinic Rochester (United States)
Amanda J. Deisher, Mayo Clinic Rochester (United States)
Michael G. Herman, Mayo Clinic Rochester (United States)
Jon J. Kruse, Mayo Clinic Rochester (United States)
Hok Seum Wan Chan Tseung, Mayo Clinic Rochester (United States)
Erik J. Tryggestad, Mayo Clinic Rochester (United States)


Published in SPIE Proceedings Vol. 11315:
Medical Imaging 2020: Image-Guided Procedures, Robotic Interventions, and Modeling
Baowei Fei; Cristian A. Linte, Editor(s)

© SPIE.