
Proceedings Paper

An adversarial machine-learning-based approach and biomechanically guided validation for improving deformable image registration accuracy between a planning CT and cone-beam CT for adaptive prostate radiotherapy applications

Paper Abstract

Adaptive radiotherapy is an effective procedure for the treatment of cancer, in which the daily anatomical changes in the patient are quantified and the dose delivered to the tumor is adapted accordingly. Deformable image registration (DIR) inaccuracies, together with delays in retrieving on-board cone-beam CT (CBCT) image datasets from the treatment system and registering them with the planning kilovoltage CT (kVCT), have restricted the adaptive workflow to a small number of patients. In this paper, we present an approach for improving DIR accuracy using machine learning coupled with biomechanically guided validation. For a set of 11 planning prostate kVCT datasets and their segmented contours, we first assembled a biomechanical model to generate synthetic abdominal motions, bladder volume changes, and physiological regression. For each of the synthetic CT datasets, we then injected noise and artifacts into the images using a novel procedure in order to closely mimic CBCT datasets. The simulated CBCT images were then used to train neural networks that predicted the noise- and artifact-removed CT images. For this purpose, we employed a constrained generative adversarial network (cGAN) consisting of two deep neural networks, a generator and a discriminator: the generator produced the artifact-removed CT images while the discriminator assessed their accuracy. The DIR results were finally validated using the model-generated landmarks. Results showed that the artifact-removed CT closely matched the planning CT. Comparisons were performed using image similarity metrics, and a normalized cross-correlation of >0.95 was obtained with the cGAN-based image enhancement. In addition, when DIR was performed, the landmarks matched to within 1.1 +/- 0.5 mm. These results demonstrate that adversarial DNN-based CBCT enhancement improves DIR accuracy and thereby bolsters the adaptive radiotherapy workflow.
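
The abstract describes a generator/discriminator pairing in which the generator produces artifact-removed CT images and the discriminator scores them. The PyTorch sketch below illustrates one possible form of that pairing and a single adversarial training step; the network sizes, residual formulation, and BCE + L1 losses are assumptions made for illustration and are not the authors' architecture.

# Illustrative sketch only; not the network described in the paper.
import torch
import torch.nn as nn

class Generator(nn.Module):
    """Maps a simulated CBCT slice to an artifact-removed (kVCT-like) slice."""
    def __init__(self, channels=1, features=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(channels, features, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(features, features, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(features, channels, 3, padding=1),
        )
    def forward(self, x):
        # Predict a residual correction so the network only has to learn
        # the noise/artifact component, not the whole image.
        return x + self.net(x)

class Discriminator(nn.Module):
    """Scores whether an image looks like a real artifact-free CT."""
    def __init__(self, channels=1, features=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(channels, features, 4, stride=2, padding=1), nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(features, features * 2, 4, stride=2, padding=1), nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(features * 2, 1, 4, stride=2, padding=1),  # patch-level real/fake map
        )
    def forward(self, x):
        return self.net(x)

def train_step(gen, disc, opt_g, opt_d, cbct, ct, l1_weight=100.0):
    """One adversarial step with assumed BCE-with-logits + L1 fidelity losses."""
    bce = nn.BCEWithLogitsLoss()
    # Discriminator update: real planning CT vs. generator output.
    fake = gen(cbct).detach()
    d_real, d_fake = disc(ct), disc(fake)
    loss_d = bce(d_real, torch.ones_like(d_real)) + bce(d_fake, torch.zeros_like(d_fake))
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()
    # Generator update: fool the discriminator while staying close to the CT.
    fake = gen(cbct)
    d_fake = disc(fake)
    loss_g = bce(d_fake, torch.ones_like(d_fake)) + l1_weight * nn.functional.l1_loss(fake, ct)
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()
    return loss_d.item(), loss_g.item()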
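
The two validation measures reported in the abstract, normalized cross-correlation between the enhanced CT and the planning CT and the landmark agreement after DIR, can be computed along the following lines. This is a minimal NumPy sketch; the function and variable names, and the voxel-spacing handling, are illustrative assumptions rather than the paper's implementation.

# Illustrative sketch only; not the validation code used in the paper.
import numpy as np

def normalized_cross_correlation(a, b):
    """Zero-mean NCC between two images of identical shape; 1.0 means identical up to scale."""
    a = a.astype(np.float64).ravel() - a.mean()
    b = b.astype(np.float64).ravel() - b.mean()
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def landmark_error_mm(warped_pts, reference_pts, voxel_spacing_mm=(1.0, 1.0, 1.0)):
    """Mean and SD Euclidean distance (mm) between DIR-mapped and reference landmarks."""
    diff = (np.asarray(warped_pts) - np.asarray(reference_pts)) * np.asarray(voxel_spacing_mm)
    dist = np.linalg.norm(diff, axis=1)
    return dist.mean(), dist.std()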

Paper Details

Date Published: 10 March 2020
PDF: 10 pages
Proc. SPIE 11313, Medical Imaging 2020: Image Processing, 113130P (10 March 2020); doi: 10.1117/12.2550493
Author Affiliations:
Anand P. Santhanam, Univ. of California, Los Angeles (United States)
Michael Lauria, Univ. of California, Los Angeles (United States)
Brad Stiehl, Univ. of California, Los Angeles (United States)
Daniel Elliott, SegAna, LLC (United States)
Saty Seshan, SegAna, LLC (United States)
Scott Hsieh, Univ. of California, Los Angeles (United States)
Minsong Cao, Univ. of California, Los Angeles (United States)
Daniel Low, Univ. of California, Los Angeles (United States)


Published in SPIE Proceedings Vol. 11313:
Medical Imaging 2020: Image Processing
Ivana Išgum; Bennett A. Landman, Editors

© SPIE.