
Proceedings Paper

Development of an augmented reality approach to mammographic training: overcoming some real world challenges
Author(s): Qiang Tang; Yan Chen; Gerald Schaefer; Alastair G. Gale

Paper Abstract

A dedicated workstation and its corresponding viewing software are essential requirements in breast screener training. A major challenge in developing further generic screener training technology (in particular, for mammographic interpretation training) is that high-resolution radiological images must be displayed on dedicated workstations, whilst actual reporting of the images is generally completed on individual standard workstations. For commercial reasons, leading international vendors of dedicated clinical workstations tend not to divulge the critical technical details that would facilitate integration of third-party generic screener training technology. On standard workstations, conventional screener training depends heavily on manual transcription, so traditional training methods can be deficient in real-time feedback and interaction. Augmented reality (AR) allows real and virtual environments to be combined, and can therefore supplement conventional training with registered virtual objects and actions. As a result, realistic screener training can incorporate rich feedback and interaction in real time. Previous work [1] has shown that it is feasible to employ an AR approach to deliver workstation-independent radiological screening training by superimposing appropriate feedback, coupled with the use of interaction interfaces. That study addressed presence issues and provided an AR-recognisable stylus that allowed drawing interaction. As a follow-up, this study extends the AR method and investigates realism effects and the impact of environmental illumination, application performance, and transcription. A robust stylus calibration method is introduced to address environmental changes over time. Moreover, this work introduces a complete AR workflow that allows real-time recording, computer-analysable training data, and recoverable transcription for post-training review. Quantitative evaluation shows that more than 80% of user-drawn points fall within a pixel distance of 5.
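The abstract does not describe the evaluation procedure beyond this summary figure, but a minimal sketch of such a point-accuracy metric might look like the following (Python; the nearest-point pairing rule, the function name, and the sample coordinates are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def fraction_within_tolerance(drawn, reference, tol=5.0):
    """Fraction of user-drawn points lying within `tol` pixels of the
    nearest reference point.

    drawn, reference: (N, 2) and (M, 2) arrays of (x, y) pixel coordinates.
    """
    drawn = np.asarray(drawn, dtype=float)
    reference = np.asarray(reference, dtype=float)
    # Pairwise Euclidean distances between every drawn and reference point.
    d = np.linalg.norm(drawn[:, None, :] - reference[None, :, :], axis=-1)
    # A drawn point counts as accurate if its nearest reference point
    # lies within the pixel tolerance.
    return float(np.mean(d.min(axis=1) <= tol))

# Hypothetical example: a drawn outline scored against a reference outline.
drawn = [(10, 10), (12, 14), (30, 31)]
reference = [(11, 11), (13, 13), (29, 30)]
print(fraction_within_tolerance(drawn, reference, tol=5.0))  # prints 1.0
```

Under this reading, the reported result would correspond to the function returning a value above 0.8 at a 5-pixel tolerance.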

Paper Details

Date Published: 13 March 2018
PDF: 7 pages
Proc. SPIE 10576, Medical Imaging 2018: Image-Guided Procedures, Robotic Interventions, and Modeling, 105762M (13 March 2018); doi: 10.1117/12.2293496
Author Affiliations:
Qiang Tang, Loughborough Univ. (United Kingdom)
Yan Chen, Loughborough Univ. (United Kingdom)
Gerald Schaefer, Loughborough Univ. (United Kingdom)
Alastair G. Gale, Loughborough Univ. (United Kingdom)


Published in SPIE Proceedings Vol. 10576:
Medical Imaging 2018: Image-Guided Procedures, Robotic Interventions, and Modeling
Baowei Fei; Robert J. Webster, Editor(s)

© SPIE.