Activity recognition and monitoring play important roles in smart home systems, which rely on perceiving and understanding inhabitants' behavior to generate a response. Applications that require such pattern recognition include alarms to detect anomalies and personalized home automation systems (where automatic lighting, heating, or security are tailored to the habits of the inhabitants).
To observe activity in an environment by the least intrusive means, we considered the use of ambient sensors rather than cameras or wearables. Most existing work in this area focuses on environments with single inhabitants,1 or on laboratories or temporarily occupied spaces,2 and is conducted using high-density motion sensors.3 We aim to provide a solution that includes tools for tracking, recognizing, and monitoring the day-to-day activities of the inhabitants, but that is also suitable for real-life settings, under a strict privacy policy discussed and agreed upon with the inhabitants.
Our testbed was a 260m² home in Carinthia, Austria, based on a framework similar to that of the Casa Vecchia project.4 A family of four inhabited the house: parents aged 46 and 45, and two children aged 10 and 13; there were also regular visitors. Around the house we installed passive IR motion sensors, magnetic contacts for doors and cupboards, in-between switches, and window-blind actuators (see Figure 1). All the devices (except the blind actuators) generated binary values (on/off, open/closed) when triggered by the inhabitants' interactions with them.
Figure 1. Left: Floor plan of the testbed house and sensor placements for the ambient sensor network. Right: Examples of the sensors: a motion sensor on the wall (top), a magnetic contact on a shower door (center), and an in-between switch for the TV (bottom).
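The resulting event stream has a very simple structure: each triggered device emits a time-stamped binary value. As a minimal sketch, assuming a hypothetical line-based log format (ISO timestamp, sensor identifier, 0/1 value) rather than our actual storage format, such events could be parsed as follows:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class SensorEvent:
    timestamp: datetime
    sensor_id: str   # e.g. "motion_kitchen", "contact_shower_door" (names are illustrative)
    value: bool      # on/off for switches and motion, open/closed for contacts

def parse_event(line: str) -> SensorEvent:
    # Assumed log format: "2013-09-14T08:32:10 motion_kitchen 1"
    ts, sensor, raw = line.split()
    return SensorEvent(datetime.fromisoformat(ts), sensor, raw == "1")

event = parse_event("2013-09-14T08:32:10 motion_kitchen 1")
```

A stream of such events, ordered by timestamp, is the sole input to the tracking, recognition, and visualization tools described below.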
For privacy reasons, we were unable to use video cameras for observation, and for comfort and practicality we chose not to use identification devices (radio-frequency identification bracelets, for example). Therefore, to recognize and track each person's movement around the space, we plan to extend an existing graph-based tracking method2 by incorporating individual walking speed and daily movement patterns. We would then implement different graphical probabilistic models to recognize activities, such as coupled or parallel hidden Markov models (stochastic models where the system is assumed to have unobserved states) and conditional random fields (discriminative statistical models for labeling sequences). In a multi-inhabitant setting, people perform their activities either individually in parallel with each other, or jointly as a group. Most previous studies have involved two inhabitants,5–7 but we plan to analyze the accuracy and complexity of the models when applied to a greater number of inhabitants.
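To illustrate the hidden Markov model approach, the sketch below decodes a toy sequence of sensor observations into a sequence of activities with the Viterbi algorithm. The activities, observation symbols, and all probabilities are invented for the example; they are not learned from our data, and a coupled or parallel HMM would extend this single chain with one chain per inhabitant.

```python
import numpy as np

# Toy single-chain HMM: hidden activities, observed sensor symbols.
activities = ["sleeping", "cooking", "watching_tv"]
observations = ["motion_bedroom", "motion_kitchen", "tv_switch"]

start = np.array([0.5, 0.3, 0.2])            # P(first activity)
trans = np.array([[0.8, 0.1, 0.1],           # P(next activity | current activity)
                  [0.1, 0.7, 0.2],
                  [0.1, 0.2, 0.7]])
emit = np.array([[0.8, 0.1, 0.1],            # P(observation | activity)
                 [0.1, 0.8, 0.1],
                 [0.1, 0.1, 0.8]])

def viterbi(obs_idx):
    """Most likely activity sequence for a list of observation indices."""
    n = len(obs_idx)
    delta = np.zeros((n, len(activities)))   # best path probability ending in each state
    psi = np.zeros((n, len(activities)), dtype=int)  # backpointers
    delta[0] = start * emit[:, obs_idx[0]]
    for t in range(1, n):
        scores = delta[t - 1][:, None] * trans * emit[:, obs_idx[t]]
        psi[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0)
    path = [int(delta[-1].argmax())]
    for t in range(n - 1, 0, -1):            # trace backpointers to recover the path
        path.append(int(psi[t][path[-1]]))
    return [activities[i] for i in reversed(path)]
```

For example, two kitchen-motion events followed by a TV-switch event decode to `["cooking", "cooking", "watching_tv"]` under these toy parameters.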
Providing the ground truth for the sensor data (i.e., the context in which the data was collected at the site) is an important step for evaluating our developed tools and algorithms. We experimented with two different annotation methods to label the data with the person who triggered each event and the activities of each person. The first was a self-reporting approach using a simple mobile application. We found that many activities either went unreported (30–70% per person) or were recorded with inaccurate time stamps. In the second approach, each person wore a camera that automatically took pictures of the environment every 30 seconds, resulting in image sequences of first-person camera views. An annotator then labeled the sensor data by browsing through the images. Although more time and effort were required for this approach, the resulting information was richer and closer to the ground truth than with self-reporting.
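To support the annotator, each sensor event needs to be matched to the photo taken closest to it in time, which is a simple nearest-timestamp lookup. The helper below is a sketch of this step (the function name is ours, not part of any annotation tool), assuming the image timestamps are already sorted:

```python
import bisect
from datetime import datetime

def nearest_image(event_time, image_times):
    """Return the image timestamp closest to a sensor event.

    image_times must be a sorted list of datetimes, one per photo
    (roughly 30 seconds apart in our setup)."""
    i = bisect.bisect_left(image_times, event_time)
    # Only the neighbors on either side of the insertion point can be closest.
    candidates = image_times[max(0, i - 1):i + 1]
    return min(candidates, key=lambda t: abs(t - event_time))
```

The annotator can then jump directly from an event to the corresponding image, rather than scanning the whole sequence.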
We have also developed several visualization interfaces for monitoring, based on the sensor data alone, and these offer significant insight into people's behavior. With simple charts (see Figure 2), we can obtain information about the likely time of day when an activity is performed, and how that varies across the days of the week. For example, Figure 2 shows that the inhabitants usually watch TV only in the evening on weekdays, but throughout the day at weekends. With a longer observation period we could also observe how activities might change according to the seasons, or the increasing age of the inhabitants.
Figure 2. Visualization of TV usage during September–November 2013. The lower graph shows the total usage per day and the vertical graph shows the usage distribution per hour.
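An hourly distribution like the one in Figure 2 can be derived directly from the on/off events of the TV switch. The sketch below is a simplified reconstruction of that computation (not our actual visualization code): it splits each usage interval at hour boundaries and accumulates minutes per hour of day.

```python
from collections import defaultdict
from datetime import datetime, timedelta

def hourly_usage(intervals):
    """Accumulate usage minutes per hour of day from (on, off) datetime pairs."""
    minutes = defaultdict(float)
    for on, off in intervals:
        t = on
        while t < off:
            # End of the current clock hour, or the off time, whichever comes first.
            hour_end = t.replace(minute=0, second=0, microsecond=0) + timedelta(hours=1)
            end = min(off, hour_end)
            minutes[t.hour] += (end - t).total_seconds() / 60
            t = end
    return dict(minutes)
```

For instance, a single viewing session from 20:30 to 22:00 contributes 30 minutes to the 20:00 bin and 60 minutes to the 21:00 bin; summing over many days yields the per-hour distribution shown in the figure.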
We are currently developing the tracking and activity recognition algorithms using our annotated data set. We plan to make the data available to the research community in the near future, and furthermore we will perform usability studies on the visualization interfaces. Our goal is to provide a functional framework for activity recognition and monitoring that can be implemented in real home environments.
This work was supported in part by the Erasmus Mundus Joint Doctorate in Interactive and Cognitive Environments, which is funded by the Education, Audiovisual and Culture Executive Agency of the European Commission under EMJD ICE FPA 2010-0012.
Chitra Ayuningtyas, Gerhard Leitner, Martin Hitz
Institute of Informatics Systems
University of Klagenfurt Alpen-Adria
Mathias Funk, Jun Hu, Matthias Rauterberg
Department of Industrial Design
Eindhoven University of Technology
1. T. van Kasteren, A. Noulas, G. Englebienne, B. Kröse, Accurate activity recognition in a home setting, Proc. 10th Int'l Conf. Ubiq. Comp., p. 1-9, 2008.
2. D. Cook, M. Schmitter-Edgecombe, A. Crandall, C. Sanders, B. Thomas, Collecting and disseminating smart home sensor data in the CASAS project, Proc. Comp. Hum. Inter. Conf., p. 4763-4766, 2009.
3. A. Crandall, D. Cook, Tracking systems for multiple smart home residents, Human Behavior Recognition Technologies: Intelligent Applications for Monitoring and Security, p. 111-129, IGI Global, 2010.
4. G. Leitner, A. Felfernig, A. J. Fercher, M. Hitz, Disseminating ambient assisted living in rural areas, Sensors 14(8), p. 13496-13531, 2014.
5. G. Singla, D. Cook, M. Schmitter-Edgecombe, Recognizing independent and joint activities among multiple residents in smart environments, J. Amb. Intel. Hum. Comp. 1(1), p. 57-63, 2010.
6. Y. Chiang, K. Hsu, C. Lu, L. Fu, J. Hsu, Interaction models for multiple-resident activity recognition in a smart home, Proc. IEEE/RSJ Int'l Conf. Intel. Rob. Syst., p. 3753-3758, 2010.
7. C. Tunca, H. Alemdar, H. Ertan, O. D. Incel, C. Ersoy, Multimodal wireless sensor network-based ambient assisted living in real homes with multiple residents, Sensors 14(6), p. 9692-9719, 2014.