Border and perimeter sensing is key to addressing concerns about national security and criminal activity. In addition, due to the vastness of national borders and the human resources needed to patrol even modest-sized areas, detection systems must be automated. Finally, for environmental and operational reasons, sensor systems must be able to communicate at low power or bandwidth, and to keep false alarms to a minimum.
Simple pyroelectric motion detectors, such as those used in automatic light and home security systems, meet many of these requirements but produce too many false alarms. Imaging sensors provide a high level of discrimination, but they typically require human interaction as well as transmission of the image, which is costly in terms of bandwidth. We have been pursuing an approach that lies somewhere in between motion detectors and imaging sensors. Using a small and sparse array of detectors, we sense a minimal set of features from a moving object that enables reliable discrimination of humans, animals, and vehicles.
Our approach originated with a concept put forward by Ron Sartain of the US Army Research Laboratory.1 He reasoned that useful information about traffic along a path could be derived simply by determining the profile or silhouette of the objects that traverse it. It was hoped as well that critical discrimination could be obtained automatically. Prior efforts had shown that a great deal of information was available from shape alone.2 To investigate this idea further, we constructed a simple sensor consisting of two vertical posts populated with a sparse array of near-IR transmitters and receivers3,4 (see Figure 1) that produces a series of parallel near-IR beams between the posts. As an object passes through, each part of it breaks one of the beams. Over time, the output of all the transmitters and receivers forms a crude image representing the profile of the object. This device was used to collect profiles of people, animals, and other objects. Several straightforward classification algorithms were then trained, tested, and evaluated using the data. We achieved better than 95% correct classification of our major categories.
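To make the profile-forming idea concrete, a minimal sketch follows: the sampled beam states are simply stacked over time into a binary image whose rows are beams and columns are sampling instants. The function and the toy data are illustrative assumptions, not the actual device firmware.

```python
import numpy as np

def beams_to_profile(samples):
    """Stack per-sample beam states into a crude 2-D profile image.

    `samples` is a sequence of boolean tuples, one per sampling instant,
    where True means the corresponding near-IR beam was broken.
    Rows of the result are beams; columns are time samples.
    """
    return np.array(samples, dtype=bool).T

# A toy object passing through a 4-beam array over 5 samples:
samples = [
    (False, False, True, True),
    (False, True,  True, True),
    (True,  True,  True, True),
    (False, True,  True, True),
    (False, False, True, True),
]
profile = beams_to_profile(samples)
print(profile.shape)  # (4, 5): 4 beams x 5 time samples
```

Note that the horizontal axis of such a profile is time, so the apparent width of the silhouette depends on the object's speed as well as its size.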
Figure 1. Near-IR beam-break profiling sensor (left) and example profile (right).
The near-IR beam prototype proved the concept of a profiling sensor, but it has many drawbacks in terms of deployment. We decided on a passive-IR approach as a viable alternative.5–7 Pyroelectric-IR (PIR) devices were an obvious choice for detectors because they need no cryogenic cooling and exist in both single-device and array form. To help find the best configuration, we developed a detailed video simulation that produces output from arbitrary configurations of PIR arrays. We also used the simulation to account for the effects of lenses or mirrors for focusing. Figure 2 presents an example output.
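The simulation's core idea, that a pyroelectric element responds only to *changes* in incident flux, can be sketched as follows. The block-averaging optics model and all names here are illustrative assumptions, not the published simulation.

```python
import numpy as np

def simulate_pir_column(frames, col, n_detectors):
    """Approximate the response of a vertical linear array of PIR detectors.

    `frames`: array of shape (T, H, W) holding long-wave-IR intensity frames.
    Each detector integrates a contiguous block of rows in one pixel column,
    and its output is the frame-to-frame change in that flux, since
    pyroelectric elements respond only to changing radiation.
    """
    T, H, W = frames.shape
    column = frames[:, :, col]                   # (T, H) slice seen by the array
    blocks = column.reshape(T, n_detectors, -1)  # group rows per detector
    flux = blocks.mean(axis=2)                   # (T, n_detectors) integrated flux
    return np.diff(flux, axis=0)                 # temporal derivative per detector

frames = np.zeros((3, 8, 8))
frames[1, 2:6, 4] = 1.0  # a warm object appears in frame 1, then vanishes
resp = simulate_pir_column(frames, col=4, n_detectors=4)
print(resp.shape)  # (2, 4): T-1 time steps x 4 detectors
```

In this toy run, the middle detectors swing positive when the warm object appears and negative when it disappears, while detectors whose field of view the object never crosses stay at zero.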
Figure 2. Long-wave-IR image (left) and simulated output of PIR linear array (right) of a donkey.
Figure 3. Prototype PIR-profiling sensor.
We are currently testing a prototype sensor constructed by the US Army Night Vision and Electronic Sensors Directorate that consists of a linear array of PIR detectors, a lens, and a microcontroller (see Figure 3). This sensor is being used to gather data for testing and training classification algorithms. Figure 4 shows sample output from the device.
Figure 4. Example output from the IR-profiling sensor. An IR image of a man running (top left), a PIR profile of a man running (top right), an IR image of a man walking (bottom left), and a PIR profile of a man walking (bottom right). The vertical line in the IR pictures shows the position of the PIR linear array. (Images courtesy of the US Army Night Vision and Electronic Sensors Directorate.)
Designing such algorithms requires identifying discriminating features in the collected data, then training and testing classifiers on labeled data sets. We have found that simple characteristics and Bayesian classifiers perform remarkably well for human, animal, and vehicle discrimination.8 For example, the maximum height and width of the profile combined with a naive Bayesian classifier typically yield correct membership probabilities greater than 95%, consistent with the results from the near-IR beam device.
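A minimal sketch of this kind of classifier follows, assuming Gaussian class-conditional densities for the height and width features. The per-class statistics are made up for illustration; the real training data and parameters are not reproduced here.

```python
import math

def gaussian_pdf(x, mean, var):
    """Univariate normal density."""
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def naive_bayes_classify(features, model, priors):
    """Pick the class maximizing prior x product of per-feature likelihoods,
    treating height and width as conditionally independent (the 'naive'
    assumption)."""
    best, best_score = None, -1.0
    for cls, stats in model.items():
        score = priors[cls]
        for f, (mean, var) in zip(features, stats):
            score *= gaussian_pdf(f, mean, var)
        if score > best_score:
            best, best_score = cls, score
    return best

# Hypothetical per-class (mean, variance) for (height in m, width in m):
model = {
    'human':   [(1.7, 0.02), (0.5, 0.01)],
    'animal':  [(0.9, 0.05), (1.2, 0.10)],
    'vehicle': [(1.5, 0.10), (4.0, 1.00)],
}
priors = {'human': 1/3, 'animal': 1/3, 'vehicle': 1/3}
print(naive_bayes_classify((1.75, 0.55), model, priors))  # 'human'
```

With only two features the naive independence assumption costs little, and the classifier needs just a handful of stored parameters per class, which suits a low-power microcontroller.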
We plan to conduct operational testing of a prototype PIR profiling sensor in 2010. We are also examining the possibility of different kinds of ‘profiles.’ For instance, certain PIR detectors are configured as differential pairs that detect motion in a preferred direction (along the axis of the two detectors). Sparse arrays of these pairs could be used to extract motion characteristics from objects of interest. We are investigating the use of these features for classifying moving objects.
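One plausible way to turn such a detector pair into a direction-of-motion feature (an illustrative assumption on our part, not the authors' published method) is to look at the lag of the peak cross-correlation between the two detector signals:

```python
import numpy as np

def motion_direction(sig_a, sig_b):
    """Infer which detector of a pair saw the object first.

    Uses the lag of the peak cross-correlation between the two detector
    signals: a positive lag means detector A led detector B (motion from
    A toward B); a negative lag means the reverse.
    """
    corr = np.correlate(sig_b, sig_a, mode='full')
    lag = corr.argmax() - (len(sig_a) - 1)
    if lag > 0:
        return 'A->B'
    return 'B->A' if lag < 0 else 'ambiguous'

t = np.arange(50)
pulse = np.exp(-((t - 20) ** 2) / 10.0)
delayed = np.exp(-((t - 30) ** 2) / 10.0)  # same pulse, 10 samples later
print(motion_direction(pulse, delayed))  # 'A->B'
```

Given the detector spacing, the same lag estimate would also yield the object's transverse speed, another candidate feature for classification.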
The authors gratefully acknowledge the support of the US Army Research Laboratory and the US Army Night Vision and Electronic Sensors Directorate in the performance of this research.
Eddie Jacobs, Srikant Chari, David Russomanno, Carl Halford
Department of Electrical and Computer Engineering
The University of Memphis
Eddie Jacobs is an assistant professor. He obtained his DSc from George Washington University in 2001. His research focuses on imaging, image understanding, and image-making devices.
Srikant Chari received his PhD in electrical engineering from the University of Memphis (2007), where he is currently a research assistant professor. His research interests include machine vision, image fusion, and metrics for algorithm performance.
David Russomanno is R. Eugene Smith Professor and chair of the Department of Electrical and Computer Engineering at the University of Memphis. He obtained a PhD in computer engineering from the University of South Carolina in 1993. His research focuses on intelligent sensors and knowledge representation for the semantic web.
Carl Halford is director of the Center for Advanced Sensors and R. Eugene Smith Professor of Electrical and Computer Engineering. He obtained his PhD from the University of Arkansas in 1970. He carries out research mainly in the areas of image-processing enhancements for sensors, novel sensor architectures, IR sensors, and intelligence, surveillance, and reconnaissance sensors.