
Proceedings Paper
Augmented reality for biomedical wellness sensor systems

| Format | Member Price | Non-Member Price |
|---|---|---|
| PDF | $17.00 | $21.00 |
Paper Abstract
Driven by the commercial movie and gaming industries, Augmented Reality (AR) technology has matured. By definition, AR allows artificial and real humans to be simultaneously present and to interact realistically with one another. With the help of physics and physiology, we can build the AR tool around real human day-night webcam inputs through simple interactions: heat transfer (getting hot), action and reaction (walking or falling), and physiology (sweating due to activity). Knowing the person's age, weight, and the 3D coordinates of the body's joints, we deduce the force, the torque, and the energy expenditure during real human movements and apply them to an AR human model. We wish to support this physics-physiology AR version, PP-AR, as a biomedical wellness (BMW) surveillance tool for seniors home alone (SHA). The functionality is to record a senior's walking and hand movements inside a home environment. Besides the fringe benefit of enabling more visits from grandchildren through AR video games, the PP-AR surveillance tool may serve as a means to screen patients for potential falls at points around the house. Moreover, we anticipate that PP-AR may help analyze the behavior history of an SHA, e.g., enhancing the Smartphone SHA Ubiquitous Care Program by discovering early symptoms such as candidate Alzheimer-like midnight excursions or Parkinson-like trembling motion when performing challenging muscular joint movements. Using a set of 3D coordinates representing human joint locations, we compute the Kinetic Energy (KE) generated by each body segment over time. The Work is then calculated and converted into calories. Using common graphics rendering pipelines, one could invoke AR technology to provide more information about patients to caretakers. Alerts to caretakers can be prompted by a patient's departure from their personal baseline, and the patient's time-ordered joint information can be loaded into a graphics viewer for high-definition digital reconstruction. An entire scene can then be viewed from any position in virtual space, and AR can display the measurement values that either constituted an alert or otherwise indicate signs of the transition from wellness to illness.
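
The sketch below is a minimal illustration, not the authors' implementation, of the energy bookkeeping described in the abstract: per-segment Kinetic Energy is estimated from time-ordered 3D joint positions (KE = 0.5 * m * |v|^2), positive changes in KE are summed as Work and converted to kilocalories, and a departure from a personal baseline triggers an alert. The joint count, even split of segment masses, frame rate, and alert threshold are illustrative assumptions and are not taken from the paper.

```python
import numpy as np

JOULES_PER_KCAL = 4184.0


def segment_kinetic_energy(positions, masses, dt):
    """Kinetic energy of each body segment over time.

    positions : (T, S, 3) array of 3D segment/joint positions per frame
    masses    : (S,) array of assumed segment masses in kg
    dt        : seconds between frames
    Returns a (T-1, S) array of KE in joules, KE = 0.5 * m * |v|^2.
    """
    velocities = np.diff(positions, axis=0) / dt      # (T-1, S, 3)
    speed_sq = np.sum(velocities ** 2, axis=-1)       # (T-1, S)
    return 0.5 * masses[None, :] * speed_sq


def work_in_kcal(ke):
    """Approximate mechanical Work as the summed positive changes in KE,
    converted from joules to kilocalories."""
    delta = np.diff(ke, axis=0)
    positive_work = np.sum(np.clip(delta, 0.0, None))
    return positive_work / JOULES_PER_KCAL


def baseline_alert(daily_kcal, baseline_kcal, tolerance=0.3):
    """Flag a departure from the personal baseline energy expenditure
    (the 30% tolerance is an assumption for the example)."""
    return abs(daily_kcal - baseline_kcal) > tolerance * baseline_kcal


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    T, S = 300, 15                                 # 300 frames, 15 tracked segments
    positions = np.cumsum(rng.normal(scale=0.01, size=(T, S, 3)), axis=0)
    masses = np.full(S, 70.0 / S)                  # crude even split of a 70 kg body
    ke = segment_kinetic_energy(positions, masses, dt=1 / 30)
    kcal = work_in_kcal(ke)
    print(f"estimated work: {kcal:.4f} kcal, alert: {baseline_alert(kcal, 0.02)}")
```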
Paper Details
Date Published: 29 May 2013
PDF: 9 pages
Proc. SPIE 8750, Independent Component Analyses, Compressive Sampling, Wavelets, Neural Net, Biosystems, and Nanoengineering XI, 875017 (29 May 2013); doi: 10.1117/12.2018409
Published in SPIE Proceedings Vol. 8750:
Independent Component Analyses, Compressive Sampling, Wavelets, Neural Net, Biosystems, and Nanoengineering XI
Harold H. Szu, Editor(s)
Author Affiliations:
Jeffrey Jenkins, George Mason Univ. (United States)
Harold Szu, The Catholic Univ. of America (United States)
© SPIE.
