Dispersion of light rays is what gives rainbows their colorful and awe-inspiring beauty. The same phenomenon, however, gives optical designers sleepless nights: the dispersion of light's different wavelengths is one of the most challenging hurdles to overcome when designing an optical system.
When light of different wavelengths passes through a medium, its behavior is strongly affected by the medium's physical properties. In designing lenses, the optical engineer must find a solution that prevents the different wavelengths from focusing at different points and thus producing an unsatisfactory image. Designing such achromatic lenses is especially difficult for multispectral imaging, i.e., the interpretation of images produced under a wide variety of light conditions, including at night. These lenses, together with their associated light detectors, form an optical system designed to operate mainly in the IR portion of the electromagnetic spectrum. But in any low-light environment, some visible light is almost always present. Handling a combination of visible and IR light is a formidable challenge for optical systems.
An analogy may be helpful in understanding the difficulty involved in designing an achromatic optical system that encompasses both IR and visible light. Let us take a look at just the visible-light portion of the spectrum. Consider a group of runners at a cross-country track meet. These runners have identical abilities: when conditions are the same for all, they run at exactly the same speed. Each runner wears a jersey representing one color of the visible spectrum.
At the starting line, the runners position themselves in the same order as the spectrum, from the shortest (blue) to the longest wavelength (red). If all conditions on the course are equal, all runners will cross the finish line simultaneously. However, this is no ordinary race. The topography of the course is steepest at blue's position and decreases in difficulty with each ‘color’ that follows. As the race begins, each runner begins with the same inherent speed. But the varying terrain of the course slows blue's progress dramatically compared with the other colors. He soon falls behind the pack. The other colors also begin to spread out, with red in the lead. Red crosses the finish line first and blue is last.
This analogy describes what light experiences when it travels through a refractive medium. The variation in the course's difficulty, which governs each runner's speed, is analogous to the wavelength-dependent bending of light as it emerges from the transparent material.1
How does an optical designer correct for this phenomenon? Let us go back to the analogy above. Now, however, we will add a second race course that follows immediately upon the first but is exactly the reverse in difficulty. In other words, red now has the steepest incline and blue the shallowest. The net result of the new addition is that blue is able to slowly catch up with red, and all runners will cross the finish line at the same moment.
So, the optical designer must design a system in which all wavelengths of visible light ‘cross the finish line’ at the same time. But it is more difficult than the analogy suggests, because no optical ‘course’ really slopes linearly from blue to red, or vice versa. Instead, its slope varies nonlinearly between the two extremes. In imaging, this means that intermediate wavelengths, those between blue and red, end up defocused. This residual error is referred to as ‘secondary spectrum,’ and it can be one of the most difficult problems to control in broadband optics.
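The ‘second, reversed race course’ of the analogy corresponds to the classic thin-lens achromatic doublet: a positive element of low-dispersion glass paired with a negative element of high-dispersion glass so that their primary chromatic aberrations cancel. The sketch below shows the arithmetic; the Abbe numbers used are approximate textbook values for a common crown/flint pair, not design data from this article.

```python
# Sketch: thin-lens achromatic doublet power split.
# Cancelling primary chromatic aberration for two thin lenses in
# contact requires  phi_c/V_c + phi_f/V_f = 0  while the powers sum
# to the desired total:  phi_c + phi_f = phi_total.

def achromat_powers(total_power, v_crown, v_flint):
    """Return (crown_power, flint_power) in diopters that sum to
    total_power and cancel primary chromatic aberration."""
    phi_crown = total_power * v_crown / (v_crown - v_flint)
    phi_flint = -total_power * v_flint / (v_crown - v_flint)
    return phi_crown, phi_flint

# 100 mm focal length -> total power of 10 diopters.
# Abbe numbers ~64.2 (crown) and ~33.9 (flint) are illustrative.
phi_c, phi_f = achromat_powers(10.0, v_crown=64.2, v_flint=33.9)
print(round(phi_c, 2), round(phi_f, 2))  # positive crown, negative flint
```

Note that this cancels chromatic focus error only at the two design wavelengths; the nonlinear dispersion between them is exactly the secondary spectrum described above.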
Figure 1. Nonlinear behavior of zinc selenide in light transmission from visible to far-IR wavelengths. MWIR, LWIR: Mid-, long-wavelength IR.
The degree to which this nonlinearity affects chromatic aberration is proportional to the width of the spectral band that is accommodated. Figure 1 depicts the properties of zinc selenide, a yellowish semiconductor material that transmits light across a wide band of wavelengths.2 The variation in refractive index (light's speed on the ‘course’) indicates the large difference in slope across different regions of the material. To compensate for the speed at which light moves through this material, a designer would need to find an alternate material with a similar refractive profile from which to build the second, compensating course. If the slope in the region of interest is fairly gentle, this is readily achievable. For example, finding a pairing over the mid- or long-wavelength IR sections of the spectrum would be fairly simple as long as we did not venture into the visible regime. However, that is seldom possible, since visible light is almost always part of the equation.
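One simple way to compare candidate materials over a band is an Abbe-style relative-dispersion figure computed from the refractive index at the short, middle, and long edges of the band; materials with similar figures are more promising pairing candidates. The index values in this sketch are made-up placeholders chosen purely to show the arithmetic, not measured data for any real glass or crystal.

```python
# Sketch: an Abbe-style figure of merit for comparing how gently
# two candidate materials disperse over a chosen spectral band.

def relative_dispersion(n_short, n_mid, n_long):
    """Abbe-style ratio (n_mid - 1) / (n_short - n_long).
    Larger values mean gentler dispersion over the band."""
    return (n_mid - 1.0) / (n_short - n_long)

# Hypothetical indices at the short, middle, and long band edges.
material_a = relative_dispersion(1.542, 1.530, 1.521)
material_b = relative_dispersion(2.470, 2.440, 2.420)
print(material_a, material_b)  # the gentler material has the larger figure
```

A single ratio like this captures only the average slope; judging the ‘undulations’ responsible for secondary spectrum requires comparing partial dispersions across sub-bands as well.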
As surveillance technology advances, optical systems need to contend with ever-wider sections of the electromagnetic spectrum. For that reason, multispectral lenses are the heart of some of the most sophisticated optical systems in use today. Modern optical systems enable continuous day/night operation while offering greater image clarity for threat detection, personnel identification, and surveillance applications.
Designers faced with limited choices of lens materials—and the nonlinear dispersive behavior of those that are available—can find the task of pairing two spectral ranges daunting. One solution to that problem is to use different optics for different spectral bands. However, that is frequently not a practical option, because the added volume and weight of a dual system cannot be accommodated. Unmanned-vehicle systems, for example, typically have a frustratingly small amount of space available for an optical system.
Why not consider using a reflective system rather than lenses? Since mirrors are immune to chromatic aberration, they would, at first glance, seem to offer the most logical approach for an optical system. If a system covers an angular field of view of only a few degrees or has a lot of light to work with, then the reflective approach may indeed make the most sense.3 Most sensor systems, however, require wide fields of view and must operate at low light levels. In those situations, only lens-based systems will do. The holy grail for lens designers has, therefore, been an optical design that can accommodate multiple light detectors simultaneously. Such a system would significantly reduce the space and weight requirements of a vehicle's payload. The same advantages would carry over to a multitude of other applications, such as hand-held and head-mounted sensors.
To address this need, we have developed such an optical solution. In doing so, we departed from the older approach of testing materials more or less at random for suitable pairings in favor of a rigorous mathematical approach, systematizing what tends to be a ‘needle-in-a-haystack’ search for optimal materials. The result is a patented system, dubbed the SuperBand™ lens, for achromatic imaging over spectral bands extending from visible to far-IR wavelengths.
Figure 2. SuperBand concept in a dual-detector configuration. SWIR: Short-wave IR.
The system uses a dual-detector arrangement (see Figure 2). The two independent detectors are separated by a beam splitter that forms separate images from short- and long-wavelength IR rays. This optical system can be interfaced very simply with broadband receivers to deliver crisp, clear images in whatever light is available. Figure 3 demonstrates the system's performance. This particular version of the design can be outfitted for any spectral range extending from 550nm to 12μm, the designated far-IR boundary. Figure 4 shows the lens.
Figure 3. RMS spot size (red line) at increasing wavelengths for the 100mm F/5 SuperBand concept. F: Focal ratio. Spot-size values represent the degree of image blur. The blue line represents the diffraction limit, the best performance a system can physically achieve. Any RMS spot size on or below that line represents effectively perfect optics.
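The diffraction limit that forms the baseline in such a plot can be sketched with the standard Airy-disk formula: the first-zero radius of the diffraction pattern is 1.22 times the wavelength times the focal ratio, so it grows linearly with wavelength and longer IR wavelengths tolerate proportionally larger spot sizes. The F/5 value below matches the Figure 3 concept; the specific wavelengths are merely representative samples of the band.

```python
# Sketch: diffraction-limited (Airy) spot radius versus wavelength
# for a fixed focal ratio, illustrating the baseline of a spot-size
# plot like Figure 3.

def airy_radius_um(wavelength_um, f_number):
    """First-zero radius of the Airy diffraction pattern, in microns:
    r = 1.22 * lambda * F."""
    return 1.22 * wavelength_um * f_number

for wl in (0.55, 1.0, 4.0, 10.0):  # visible through LWIR, in microns
    print(f"{wl:5.2f} um -> {airy_radius_um(wl, 5.0):6.2f} um")
```

This linear growth is why a single RMS spot-size figure is meaningless without the diffraction-limit curve alongside it: a 10 μm blur may be far from perfect in the visible yet comfortably diffraction-limited in the LWIR.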
Figure 4. StingRay 100mm F/2.5 SuperBand lens.
The need for ever more sophisticated imaging systems is increasing at a rate that requires bold, creative solutions that have, until recently, been widely regarded as unobtainable. Today, those solutions are being achieved. The challenge now for optical designers is to step up their creativity even further to keep pace with the demands of technology. Technical challenges that our company is working on include the design of motorized zoom lenses for broadband imaging. We will be closely involved in further development of advanced multispectral imaging systems and in their application to the imaging requirements of unmanned vehicles, tracking systems, and remote sensing.
StingRay Optics LLC
Christopher Alexay founded StingRay Optics in August 2004 and has established the company as a recognized supplier of custom optical systems. Prior to this, he was a director of Janos Technology Inc.