Control of unmanned aerial vehicles using insect optical sensors

Insect-inspired optical sensors provide effective stabilization, compassing, and navigation.
08 March 2011
Javaan S. Chahl and Akiko Mizutani

The key principles of insect vision have been deduced over decades of biological research.1,2 A recurring theme is the reliance of insects on the spatial and temporal distribution of light for flight control. Computational foundations and control laws derived from insects3 have become increasingly well formulated. Over the same period, unmanned aerial vehicles (UAVs) have emerged as a revolution in air power. However, UAVs are not as autonomous as we would like. Unlike insects, they depend on artificial sources for position information and do not operate well close to obstacles. As we attempt to miniaturize them, scaling problems emerge that go beyond the reduced fidelity of small sensors. For example, the vibration of flapping wings degrades inertial sensing, while densely packed avionics distort measurements of the earth's magnetic field.

Insects have overcome these problems with a highly integrated visuomotor system. They use the polarization pattern of the sky for heading, stabilize using the horizon as a visual reference, and navigate over terrain using vision. Many insect reflexes function only as part of an ensemble, analogous to the biologically inspired subsumption architecture proposed by Brooks.4

The first task of any UAV is stabilization. Agile airframes must be actively stabilized by their flight-control systems. We have reverse engineered the basic function of ocelli—an auxiliary visual system found in most insects—and tested their performance as part of the flight-control system of the UAV shown in Figure 2. Ocelli are composed of simple eyes, as distinct from compound eyes (see Figure 1). The ocelli of dragonflies have wide fields of view and a retina containing UV and green photoreceptors. They are anatomically adapted for sensing attitude and provide a primary input to the insect's flight-control system.5 The reflected UV intensity off the ground is much less than that from the sky. By balancing left and right visual-field intensities, dragonflies can hold their heads level. The sun's intensity would be a major disruption if it fell within the visual field of an ocellus. A spectral-opponency technique that includes input from green photoreceptors in the light-balancing algorithm makes it possible to eliminate the sun's effect under many conditions. The inset in Figure 2 shows a device containing engineered ocelli, which proved capable of holding the unstable airframe level over long trajectories in a wide variety of environments.
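The light-balancing idea above can be sketched in a few lines. This is a hypothetical illustration, not the authors' implementation: the four photodetector readings and the simple UV-minus-green opponency weighting are assumptions made for the sketch.

```python
def roll_error(uv_left, green_left, uv_right, green_right):
    """Estimate roll from the left/right sky-brightness imbalance.

    Spectral opponency (UV minus green) suppresses the sun, which is
    bright in both bands, while the sky remains far brighter than the
    ground in UV alone. The result is a normalized imbalance that is
    zero when the airframe is level; a controller would drive it to
    zero with aileron commands.
    """
    left = uv_left - green_left
    right = uv_right - green_right
    # Small constant avoids division by zero in very dark conditions.
    return (left - right) / (left + right + 1e-9)
```

A matched left/right pair of readings gives an error near zero; a brighter left field (left wing pointing skyward) gives a positive error.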


Figure 1. Head of a dragonfly, showing the dorsal-rim area and the lateral and median ocelli in relation to the compound eye.

Figure 2. The unmanned aerial vehicle (UAV) on which we tested some of our sensors. It requires active control to fly level. (inset) The short lens holders oriented horizontally are the engineered ocelli. The long vertically mounted lens holders comprise two polarimeters that emulate the dorsal-rim area of the compound eye.

Many flying insects use the sky's polarization pattern, cast by Rayleigh scattering6 of sun- or moonlight, to measure direction.7 This pattern is tangential to the light source at every point. Its distributed nature allows a partial view of the sky to be used for navigation. The 'dorsal-rim' area of the compound eye (see Figure 1) is often sensitive to polarization and is configured to function as a sun compass. In flight, the automatic polarimeter (see inset in Figure 2) showed good agreement with the bearing derived from conventional inertial and magnetic avionics sensors (see Figure 3).
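One common way to recover the local e-vector direction, which a sketch can make concrete, is the Stokes-parameter method: measure intensity behind polarizers at 0°, 45°, and 90° and take half the quadrant-aware arctangent. This is a standard polarimetry calculation offered as an illustration; the article does not specify the authors' algorithm.

```python
import math

def e_vector_angle(i0, i45, i90):
    """Angle of polarization from intensities measured behind linear
    polarizers oriented at 0, 45, and 90 degrees.

    Q and U are the linear Stokes parameters; the angle of
    polarization is atan2(U, Q) / 2, in (-pi/2, pi/2].
    """
    q = i0 - i90
    u = 2.0 * i45 - i0 - i90
    return 0.5 * math.atan2(u, q)  # radians
```

Because the sky's e-vector is tangential to the solar meridian, the sun's azimuth (and hence a compass heading) lies 90° from this angle, with a two-fold ambiguity that other cues must resolve.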


Figure 3. The red and green traces show, respectively, the computed polarization (Pol) heading and that based on using inertial and magnetic sensors.

Navigation requires more than holding a heading and maintaining an upright attitude. The ground must be avoided, and cross wind must be compensated for by adjusting heading. We used optical flow—a sense that insects use extensively—for flight control.8 Optical-flow sensors measure the angular motion of the environment projected onto a focal plane. A moving platform observes the relative motion of the world around it as patterns of optical flow. Range in any direction can be computed given a speed and an optical-flow measurement. By calculating the optical-flow direction, we can determine the relative direction of motion. Cross wind causes an aircraft to fly a ground course that differs from its compass bearing. The flight path can be corrected using this measure, so that the sum of the compass course and the optical-flow angle equals the desired course. We implemented this course-correction behavior using a single sensor from an optical computer mouse. The ground track was maintained to within 1° over an 800 m trajectory (see Figure 4).
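The course-correction rule above can be sketched as a crab-angle calculation. The function name and the assumption that the flow sensor reports forward and lateral flow components are illustrative; the article does not give these details.

```python
import math

def heading_command(desired_course_deg, flow_forward, flow_lateral):
    """Steer so that compass heading plus the measured drift angle
    equals the desired ground course.

    The drift angle is the direction of ground motion relative to the
    aircraft's nose, recovered from the two optical-flow components.
    Subtracting it from the desired course crabs the aircraft into
    the cross wind.
    """
    drift = math.degrees(math.atan2(flow_lateral, flow_forward))
    return (desired_course_deg - drift) % 360.0
```

With no lateral flow the command equals the desired course; a rightward drift produces a compensating heading to the left of track.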


Figure 4. The red and green traces show, respectively, a course that was corrected for lateral drift using optical flow and a flight immediately afterwards in which the optical-flow system was turned off. In both cases, the UAV was commanded to fly grid North.

In summary, we demonstrated substantial autonomy with an array of simple sensors in a simple environment. Insects use optical flow over the entire visual field for flight control. They have a distributed view of the polarization pattern above and below the horizon. A comprehensive sensor suite emulating the insect optical and neural system would provide the means to implement these behaviors more robustly in complex environments. We continue to work toward this goal.

The authors thank NASA and the Defense Advanced Research Projects Agency for supporting aspects of this work. Ongoing work is funded by the Australian Defence Science and Technology Organisation Corporate Enabling Research Program.


Javaan S. Chahl
Defence Science and Technology Organisation
Edinburgh, Australia

Javaan Chahl received a Bachelor of Engineering degree in 1991 and completed his doctorate at the Australian National University in 1996. He now leads the corporate research program on UAVs.

Akiko Mizutani
Odonatrix Pty. Ltd.
One Tree Hill, Australia

Akiko Mizutani completed her doctorate at Kyushu University (Japan) in 1996. She is the chief executive officer of Odonatrix, a defense research company.


References:
1. K. von Frisch, Tanzsprache und Orientierung der Bienen, Springer-Verlag, 1965.
2. B. Hassenstein, W. Reichardt, Systemtheoretische Analyse der Zeit-, Reihenfolgen- und Vorzeichenauswertung bei der Bewegungsperzeption des Rüsselkäfers Chlorophanus, Zeitschr. Naturforsch. 11b, pp. 513-524, 1956.
3. M. V. Srinivasan, S. W. Zhang, M. Lehrer, T. S. Collett, Honeybee navigation en route to the goal: visual flight control and odometry, J. Exp. Biol. 199, pp. 237-244, 1996.
4. R. A. Brooks, Elephants don't play chess, Robot. Auton. Syst. 6, pp. 3-15, 1990.
5. G. Stange, J. Howard, An ocellar dorsal light response in a dragonfly, J. Exp. Biol. 83, pp. 351-355, 1979.
6. J. W. Strutt (Lord Rayleigh), On the light from the sky, its polarisation and colour, Philos. Mag. 41, pp. 107-120, 274–279, 1871.
7. G. Horváth, D. Varjú, Polarized Light in Animal Vision: Polarization Patterns in Nature, Springer-Verlag, 2004.
8. M. V. Srinivasan, Visual motor computations in insects, Annu. Rev. Neurosci. 27, pp. 679-696, 2004.