
Proceedings Paper

Mobile app for human-interaction with sitter robots
Author(s): Sumit Kumar Das; Ankita Sahu; Dan O. Popa

Paper Abstract

Human environments are often unstructured and unpredictable, making the autonomous operation of robots in such environments very difficult. Despite many remaining challenges in perception, learning, and manipulation, more and more studies involving assistive robots have been carried out in recent years. In hospital environments, and in particular in patient rooms, there are well-established practices with respect to the type of furniture, patient services, and schedule of interventions. As a result, adding a robot to such semi-structured hospital environments is an easier problem to tackle, with results that could benefit the quality of patient care and the help that robots can offer to nursing staff. When working in a healthcare facility, robots need to interact with patients and nurses through Human-Machine Interfaces (HMIs) that are intuitive to use; they should also maintain awareness of their surroundings and offer safety guarantees for humans. While fully autonomous operation of robots is not yet technically feasible, direct teleoperation of the robot would be extremely cumbersome, as it requires expert user skills and levels of concentration not available to many patients.

Therefore, in our current study we present a traded control scheme in which the robot and human each perform the tasks at which they are expert. The human-robot communication and control scheme is realized through a mobile tablet app that can be customized for robot sitters in hospital environments. The role of the mobile app is to augment the verbal commands given to a robot through natural speech, the camera, and other native interfaces, while providing failure-mode recovery options for users. Our app can access video feeds and sensor data from robots, assist the user with decision making during pick-and-place operations, monitor the user's health over time, and provide conversational dialogue during sitting sessions. In this paper, we present the software and hardware framework that enables a patient-sitter HMI, and we include experimental results with a small number of users demonstrating that the concept is sound and scalable.
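The traded control idea described above, where routine subtasks run autonomously on the robot while decision-heavy steps and failure recovery are handed back to the user via the app, can be sketched as a simple task dispatcher. All task names and the `Actor`/`dispatch` helpers below are hypothetical illustrations, not the paper's actual software.

```python
from enum import Enum, auto

class Actor(Enum):
    ROBOT = auto()   # robot executes the subtask autonomously
    HUMAN = auto()   # user decides via the tablet app

# Hypothetical split of sitter tasks under traded control:
# routine subtasks go to the robot, judgment calls to the user.
TASK_SPLIT = {
    "navigate_to_bed": Actor.ROBOT,
    "pick_object": Actor.ROBOT,
    "confirm_object_choice": Actor.HUMAN,  # app shows candidates, user picks
    "recover_from_failure": Actor.HUMAN,   # app offers recovery options
}

def dispatch(task: str, robot_ok: bool = True) -> Actor:
    """Return who handles a task; failed robot subtasks trade back to the human."""
    actor = TASK_SPLIT.get(task, Actor.HUMAN)  # unknown tasks default to the user
    if actor is Actor.ROBOT and not robot_ok:
        return Actor.HUMAN  # failure-mode recovery through the app
    return actor
```

For example, `dispatch("pick_object")` runs on the robot, but `dispatch("pick_object", robot_ok=False)` is traded back to the user, mirroring the failure-mode recovery options the app provides.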

Paper Details

Date Published: 16 May 2017
Proc. SPIE 10216, Smart Biomedical and Physiological Sensor Technology XIV, 102160D (16 May 2017); doi: 10.1117/12.2262792
Author Affiliations:
Sumit Kumar Das, Univ. of Louisville (United States)
Ankita Sahu, Univ. of Louisville (United States)
Dan O. Popa, Univ. of Louisville (United States)

Published in SPIE Proceedings Vol. 10216:
Smart Biomedical and Physiological Sensor Technology XIV
Brian M. Cullum; Douglas Kiehl; Eric S. McLamore, Editor(s)

© SPIE.