
Proceedings Paper

Shared visualizations and guided procedure simulation in augmented reality with Microsoft HoloLens
Author(s): Lawrence Huang; Scott Collins; Leo Kobayashi; Derek Merck; Thomas Sgouros

Paper Abstract

Background: Current display technologies can be suboptimal, even inadequate, when conveying spatially complex healthcare concepts to patients and providers. This can lead to difficulties in sharing medical information, with potentially deleterious effects on patient outcomes, research, and trainee education.

Methods: Investigators used off-the-shelf augmented reality (AR) technologies to facilitate visual communication for healthcare. Using untethered headset devices (Microsoft HoloLens), proof-of-concept work was completed for two use cases: (1) multi-user shared AR visualizations and (2) AR-guided invasive procedural performance. The research team collaborated to create (1) a shared AR environment that enabled multiple users to independently visualize and manipulate 3D patient anatomic models, with position, rotation, and scale synchronized across users, and (2) a hybrid (AR-physical) covered-box configuration containing CT-scanned targets and a custom trajectory guidance system for simulated needle aspiration. As a pilot study exploring the technical feasibility and experimental viability of the selected approach, the primary metrics were (1) size, displacement, and angular rotation and (2) expert aspiration success.

Results: The mean difference between AR models and physical objects was 2.0±0.4% and 1.7±0.4% across all dimensions on the two HoloLens devices. One shared-model configuration exhibited deviations of 7.8±2.8 mm in location and 0.5±0.9° in angular orientation; another showed differences of 6.5±2.1 mm and 0.1±0.7°. For AR-guided procedure simulations, two expert surgeons required 3 attempts in 10 minutes and 1 attempt in 3 minutes to successfully aspirate the hybrid targets.

Conclusion: AR technologies were used to enable core elements of an interactive shared medical visualization environment and a guided procedure simulation.
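
As an illustration of the kinds of agreement metrics reported above, the following is a minimal sketch (in Python with NumPy, not the authors' code) of how dimensional error between an AR model and its physical counterpart, and positional/angular deviation between two users' views of a shared model, might be computed. All function names and sample values are hypothetical placeholders; the abstract does not describe the actual measurement procedure or software used.

```python
# Minimal sketch of agreement metrics of the kind reported in the abstract.
# Not the authors' code; values below are illustrative placeholders.

import numpy as np


def percent_dimension_error(ar_dims_mm: np.ndarray, physical_dims_mm: np.ndarray):
    """Mean +/- SD of per-dimension error as a percentage of the physical size."""
    pct = 100.0 * np.abs(ar_dims_mm - physical_dims_mm) / physical_dims_mm
    return pct.mean(), pct.std(ddof=1)


def positional_deviation_mm(pos_a_mm: np.ndarray, pos_b_mm: np.ndarray) -> float:
    """Euclidean distance between the shared model's position as rendered for two users."""
    return float(np.linalg.norm(pos_a_mm - pos_b_mm))


def angular_deviation_deg(quat_a: np.ndarray, quat_b: np.ndarray) -> float:
    """Smallest rotation angle (degrees) between two unit quaternions in (w, x, y, z) order."""
    dot = abs(float(np.dot(quat_a, quat_b)))
    return float(np.degrees(2.0 * np.arccos(np.clip(dot, -1.0, 1.0))))


if __name__ == "__main__":
    # Hypothetical bounding-box dimensions (mm): hologram vs. physical object
    ar = np.array([102.1, 150.8, 76.3])
    real = np.array([100.0, 148.0, 75.0])
    mean_pct, sd_pct = percent_dimension_error(ar, real)
    print(f"dimensional error: {mean_pct:.1f} +/- {sd_pct:.1f} %")

    # Hypothetical shared-model pose as seen by two HoloLens users
    pos_a, pos_b = np.array([0.0, 0.0, 500.0]), np.array([5.2, -3.1, 504.0])
    qa, qb = np.array([1.0, 0.0, 0.0, 0.0]), np.array([0.99996, 0.00873, 0.0, 0.0])
    print(f"positional deviation: {positional_deviation_mm(pos_a, pos_b):.1f} mm")
    print(f"angular deviation: {angular_deviation_deg(qa, qb):.2f} deg")
```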

Paper Details

Date Published: 8 March 2019
PDF: 6 pages
Proc. SPIE 10951, Medical Imaging 2019: Image-Guided Procedures, Robotic Interventions, and Modeling, 1095112 (8 March 2019); doi: 10.1117/12.2511321
Author Affiliations:
Lawrence Huang, Brown Univ. (United States)
Scott Collins, Rhode Island Hospital (United States)
Leo Kobayashi, Rhode Island Hospital (United States)
Derek Merck, Rhode Island Hospital (United States)
Thomas Sgouros, Brown Univ. (United States)


Published in SPIE Proceedings Vol. 10951:
Medical Imaging 2019: Image-Guided Procedures, Robotic Interventions, and Modeling
Baowei Fei; Cristian A. Linte, Editors

© SPIE.