
Proceedings Paper

Deep learning-based depth estimation from a synthetic endoscopy image training set

Paper Abstract

Colorectal cancer is the fourth leading cause of cancer deaths worldwide. The detection and removal of premalignant lesions through endoscopic colonoscopy is the most effective way to reduce colorectal cancer mortality. Unfortunately, conventional colonoscopy has a nearly 25% polyp miss rate, in part due to the lack of depth information and the low contrast of the colon surface. Estimating depth with conventional hardware and software methods is challenging in endoscopy because of the limited endoscope size and the deformable mucosa. In this work, we use a joint deep learning and graphical model-based framework for depth estimation from endoscopy images. Since depth is an inherently continuous property of an object, it can naturally be posed as a continuous graphical learning problem. Unlike previous approaches, this method does not require hand-crafted features. Large amounts of augmented data are required to train such a framework. Since colonoscopy images with ground-truth depth maps are scarce and colon texture is highly patient-specific, we generated training images using a synthetic, texture-free colon phantom to train our models. Initial results show that our system can estimate depths for phantom test data with a relative error of 0.164. The resulting depth maps could prove valuable for 3D reconstruction and automated computer-aided detection (CAD) to assist in identifying lesions.
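The relative-error figure quoted above is a standard depth-estimation metric: the mean, over pixels, of the absolute depth error normalized by the ground-truth depth. The paper does not give its exact formula, so the sketch below shows one common definition as an assumption, with a toy predicted/ground-truth depth pair in place of real phantom data:

```python
import numpy as np

def mean_relative_error(pred, gt, eps=1e-8):
    """Mean relative depth error: mean(|d_pred - d_gt| / d_gt).

    `eps` guards against division by zero for degenerate depth values.
    This is one common convention; the paper's exact metric is not specified.
    """
    pred = np.asarray(pred, dtype=float)
    gt = np.asarray(gt, dtype=float)
    return float(np.mean(np.abs(pred - gt) / (gt + eps)))

# Toy example (hypothetical values, not from the paper): each prediction
# is off by 10% of the true depth, so the mean relative error is 0.1.
gt = np.array([10.0, 20.0, 40.0])
pred = np.array([11.0, 18.0, 44.0])
print(round(mean_relative_error(pred, gt), 3))
```

Under this convention, the reported 0.164 would mean the estimated depth deviates from ground truth by about 16% on average over the phantom test set.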

Paper Details

Date Published: 2 March 2018
PDF: 6 pages
Proc. SPIE 10574, Medical Imaging 2018: Image Processing, 1057421 (2 March 2018); doi: 10.1117/12.2293785
Author Affiliations:
Faisal Mahmood, Johns Hopkins Univ. (United States)
Nicholas J. Durr, Johns Hopkins Univ. (United States)

Published in SPIE Proceedings Vol. 10574:
Medical Imaging 2018: Image Processing
Elsa D. Angelini; Bennett A. Landman, Editor(s)

© SPIE