Stingray: a fast small unmanned ground vehicle for urban combat

Immersive telepresence and driver-assist behaviors increase the maximum effective speed of semiautonomous mobile robots.
24 September 2008
Brian Yamauchi and Kent Massey

Small unmanned ground vehicles (UGVs) such as the iRobot PackBot have revolutionized the way soldiers deal with improvised explosive devices (IEDs). These man-portable UGVs allow soldiers to inspect and destroy suspected IEDs from a safe distance. A typical UGV transmits video from an onboard camera back to an operator control unit (OCU), which displays the video on a computer screen. In a manner similar to playing a first-person shooter video game, the operator teleoperates the UGV with a joystick, gamepad, or other input device to control vehicle motion. While this method works well at slow speeds in simple environments, viewing the world through a fixed camera limits the operator's situational awareness of the surroundings.

Even joystick-controlled pan/tilt cameras are distracting to operate while driving the vehicle. This is one reason why small UGVs have been limited to slow speeds (typically less than 8kph) and to missions, such as IED inspection, where these speeds are tolerable.

Faster small UGVs would be useful in a wide range of military operations. When an infantry squad storms a building held by insurgents, speed is essential to maintain the advantage of surprise. When a dismounted infantry unit patrols a city on foot, the soldiers need a UGV that can keep up. However, driving at high speeds through complex urban environments is difficult for any vehicle, and small UGVs face additional challenges. They need to steer around obstacles that a larger vehicle could drive over. A bump that would be absorbed by a large vehicle's suspension can send a small, fast-moving UGV flying into the air.

For the Stingray Project,1 funded by the US Army Tank-Automotive Research, Development and Engineering Center (TARDEC), iRobot Corporation and Chatten Associates are developing technologies that will enable teleoperation of small UGVs at high speeds through urban terrain. Our approach combines immersive telepresence, which gives the operator the impression of being in the vehicle, with semiautonomous driver-assist behaviors, which maneuver the vehicle safely according to the driver's intent. This approach is analogous to the flight-control software in modern fighter aircraft, which are designed to be dynamically unstable and can be controlled only through software that interprets the pilot's control inputs.


Figure 1. iRobot Warrior UGV with a Chatten Head-Aimed Remote Viewer (HARV).

In phase 1 of the Stingray Project, we mounted a Chatten Head-Aimed Remote Viewer (HARV) on an iRobot Warrior UGV prototype (see Figure 1) and on a surrogate small UGV based on a high-speed, gas-powered, radio-controlled car platform (see Figure 2). The operator wears a head-mounted display and a head tracker (see Figure 3). The display shows video from the HARV's camera, which is mounted on a pan/tilt/roll gimbal. The HARV tracks the operator's head position and turns the camera to face in the same direction. This provides a fully immersive experience, which studies have shown can improve mission performance by 200–400%.2 Using the HARV, we teleoperated the Warrior through urban and wooded terrain at the prototype's current top speed of 10kph, and the surrogate UGV through a slalom course at speeds of 30–45kph (see video3).


Figure 2. Surrogate UGV driving through slalom course.

Figure 3. Operator driving vehicle using Chatten HARV.
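
To illustrate the head-aiming concept, the following Python sketch maps tracked head orientation to gimbal commands with a simple per-axis slew limit. The tracker and gimbal interfaces, parameter names, and rate limits are hypothetical placeholders for illustration only, not the Chatten HARV API.

```python
import time

class HeadAimedCamera:
    """Minimal head-aimed camera loop (illustrative sketch only)."""

    def __init__(self, tracker, gimbal, max_rate_dps=180.0, period_s=0.02):
        self.tracker = tracker                    # hypothetical: read_orientation() -> (yaw, pitch, roll) in degrees
        self.gimbal = gimbal                      # hypothetical: command(yaw, pitch, roll) in degrees
        self.max_step = max_rate_dps * period_s   # per-cycle slew limit (degrees)
        self.period_s = period_s
        self.current = [0.0, 0.0, 0.0]            # last commanded yaw, pitch, roll

    def step(self):
        target = self.tracker.read_orientation()  # operator's current head pose
        for i in range(3):
            error = target[i] - self.current[i]
            # Limit each axis so the gimbal never exceeds its rated slew rate.
            self.current[i] += max(-self.max_step, min(self.max_step, error))
        self.gimbal.command(*self.current)

    def run(self):
        while True:
            self.step()
            time.sleep(self.period_s)
```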

For phase 2 of Stingray, we will increase the Warrior UGV's top speed by developing a high-speed, wheeled version of the vehicle (see Figure 4) with a top speed of 32kph. To assist the driver in controlling the vehicle at this speed, we will reuse the street-following and perimeter-following behaviors developed for the TARDEC-funded Wayfarer Project. These behaviors use light detection and ranging (LIDAR) to determine the orientation of features such as street boundaries, building walls, and tree lines. During the Wayfarer Project, we demonstrated that these behaviors are robust to irregularities in both urban and rural terrain.4


Figure 4. High-speed Stingray UGV.
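
As a rough illustration of how a planar LIDAR scan can yield the orientation of linear features such as street boundaries or walls, the sketch below histograms the orientations of segments between adjacent scan points and steers to align with the dominant direction. This is a simplified stand-in for, not a description of, the Wayfarer behaviors; all function names, thresholds, and gains are assumptions made for the example.

```python
import math

def dominant_orientation(ranges, angles, max_range=30.0, bins=90):
    """Estimate the dominant orientation (radians, in [0, pi)) of linear
    features in a planar LIDAR scan given ranges (m) and bearings (rad)."""
    # Convert the polar scan to Cartesian points, dropping invalid returns.
    pts = [(r * math.cos(a), r * math.sin(a))
           for r, a in zip(ranges, angles) if 0.1 < r < max_range]
    hist = [0.0] * bins
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        dx, dy = x1 - x0, y1 - y0
        length = math.hypot(dx, dy)
        if length < 1e-3 or length > 1.0:       # skip degenerate or disjoint pairs
            continue
        theta = math.atan2(dy, dx) % math.pi    # undirected segment orientation
        hist[int(theta / math.pi * bins) % bins] += length  # weight by segment length
    best = max(range(bins), key=lambda i: hist[i])
    return (best + 0.5) * math.pi / bins

def steering_correction(ranges, angles, gain=1.0):
    """Proportional steering command that aligns the vehicle with the feature
    (vehicle heading is the scan's zero-bearing axis)."""
    error = dominant_orientation(ranges, angles)
    if error >= math.pi / 2:                    # pick the nearer of the two line directions
        error -= math.pi
    return gain * error
```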

Small vehicles driving rapidly over rough terrain will spend much of their time in the air. To assist in vehicle control, we will develop behaviors to autonomously maintain an operator-specified heading, despite perturbations caused by intermittent contact with the underlying terrain. This will free the operator from having to make constant steering adjustments to keep the vehicle driving in a straight line. We will also enhance situational awareness by adding a panoramic view inset, a rear-view inset, and haptic feedback to the HARV. These enhancements will give the operator constant awareness of obstacles and threats in all directions. Haptic feedback will allow the operator to feel collisions and sense terrain characteristics (e.g., smooth, rough, slippery) through the steering wheel controller.
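
A heading-hold behavior of this kind can be sketched as a simple PD loop on heading error, engaged whenever the operator's steering input is near neutral. The Python below is a minimal illustration under assumed conventions (a gyro- or compass-based heading estimate, normalized steering commands); the class, gains, and deadband are hypothetical and not drawn from the Stingray control software.

```python
import math

class HeadingHold:
    """Minimal PD heading-hold sketch (illustrative parameter values)."""

    def __init__(self, kp=2.0, kd=0.3, deadband=0.05):
        self.kp, self.kd = kp, kd
        self.deadband = deadband      # steering input below this counts as "hands off"
        self.target = None            # heading to hold (radians)
        self.prev_error = 0.0

    @staticmethod
    def _wrap(angle):
        """Wrap an angle difference into [-pi, pi)."""
        return (angle + math.pi) % (2 * math.pi) - math.pi

    def update(self, heading, operator_steer, dt):
        """heading: current estimate (rad); operator_steer: normalized [-1, 1];
        dt: control period (s). Returns the steering command to apply."""
        # While the operator is actively steering, pass their command through
        # and latch the current heading as the new target.
        if abs(operator_steer) > self.deadband or self.target is None:
            self.target = heading
            self.prev_error = 0.0
            return operator_steer
        # Otherwise hold the latched heading against bumps, slides, and landings.
        error = self._wrap(self.target - heading)
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.kd * derivative
```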

Stingray technology can be applied to both military and civilian vehicles. For mainstream consumers, driver-assist behaviors will make driving safer and more convenient: the car will become the driver. For driving enthusiasts, technology will allow the driver to become fully immersed in the driving experience with unobstructed 360-degree views and enhanced haptic and force feedback: the driver will become the car.

This material is based upon work supported by the US Army Tank-automotive and Armaments Command (TACOM) Life Cycle Management Command (LCMC) under contract W56HZV-08-C-0079. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the US Army TACOM LCMC.


Brian Yamauchi
iRobot Corporation
Bedford, MA

Brian Yamauchi is a lead roboticist with iRobot Corporation's Research Group and the principal investigator for the Stingray Project. His research interests include mobile robot navigation and mapping, autonomous vehicle control, and multirobot coordination. Prior to joining iRobot, he conducted robotics research at the Naval Research Laboratory, the Jet Propulsion Laboratory, NASA's Kennedy Space Center, and the Institute for the Study of Learning and Expertise.

Kent Massey
Chatten Associates
West Conshohocken, PA

Kent Massey is the chief operating officer for Chatten Associates and leads the company's Robotics Development Group. Previously, at GE Aerospace, he was the chief systems engineer on the NASA Space Station science laboratory module. He is the principal investigator for a US Army Armament Research, Development and Engineering Center-funded project to develop a ruggedized HARV.

