Smart Sensing

From oemagazine October 2004
01 October 2004
By Andrew Scott

The term "smart optics" has recently been coined to mean optical technologies with a dynamic functionality that enhances their performance. In this article we describe an example that allows users to extract specific information from a light beam and use it in an intelligent way in areas such as manufacturing and metrology. Our group has developed a new type of smart sensor that produces multiple images simultaneously, which we can use to generate detailed information about an incident wavefront [1]. With a small modification to a key component and different processing techniques, it can characterize Gaussian beams to determine M² and similar parameters [2]. The device has a potential role in metrology applications such as surface shape analysis, laser beam profiling, and adaptive optics.

The sensor is based on a novel diffraction grating called an image multiplex (IMP) grating, made up of arcs of circles rather than straight lines. The device is essentially an off-axis Fresnel zone plate, but is more conveniently regarded as a diffraction grating with a parabolic chirp (i.e., the grating phase profile varies as q², where q is the distance from the axis to a given point on the grating, so the local line spacing shrinks with increasing q). The mark-space ratio is 1:1, so when light is incident on the grating almost all of the transmitted light is diffracted into the three lowest orders (m = -1, 0, and +1). We can fabricate the IMP grating as an amplitude grating (alternating opaque and transparent lines) or as a phase grating (all transparent lines with alternating optical thickness). By choosing an appropriate phase step, we can control the ratio of power diffracted into the 0 and ±1 orders. For an incident beam significantly larger than the grating pitch, we can characterize the grating as a lens with focal length forder = fg/m, where m = -1, 0, +1 for the different diffracted orders and fg is the grating focal length, defined simply as the effective focal length for light diffracted into the m = 1 order.
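
To make the geometry concrete, the short sketch below generates a one-dimensional binary amplitude pattern with exactly this kind of parabolic chirp. It is a minimal illustration only: the function name and all parameter values are assumptions for the example, not the specification of the actual QinetiQ grating.

```python
import numpy as np

def make_imp_grating(width_mm=10.0, samples=4096, fg_mm=2500.0,
                     wavelength_mm=633e-6, offset_mm=2.0):
    """Binary amplitude pattern with a parabolic chirp: a 1-D, off-axis
    Fresnel-zone-plate section with a 1:1 mark-space ratio.

    The transmission toggles wherever the quadratic phase
    pi*q**2/(wavelength*fg) crosses a multiple of pi, so light in order m
    picks up the quadratic phase of a lens of focal length fg/m.
    All parameter values here are illustrative assumptions.
    """
    q = np.linspace(0.0, width_mm, samples) + offset_mm  # off-axis section
    phase = np.pi * q**2 / (wavelength_mm * fg_mm)
    return (np.cos(phase) > 0).astype(float)  # 1 = transparent, 0 = opaque

t = make_imp_grating()
print(f"open fraction: {t.mean():.2f}")  # close to 0.5, i.e. 1:1 mark-space
```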


Figure 1. A grating combined with a short-focal-length lens produces three images, at the +1, 0, and -1 orders. If a collimated beam is directed onto a detector at the focus, the system forms a focused spot (m = 0) and two defocused spots (m = -1, +1); the two defocused spots are images of the beam at planes 1 and 2.

The most effective way to use this grating is to combine it with another short-focal-length lens (see figure 1). The combined focal length of the grating/lens pair becomes approximately fm = f0 - mf0²/fg, where f0 is the focal length of the lens and m = -1, 0, +1 (valid for f0 << fg). When light from an object is directed at the grating/lens combination, three images are formed. If the camera is at the focal plane of the short-focal-length lens, the zero-order image will be in focus, whereas the ±1 order images will be out of focus by equal and opposite amounts. We can show that the defocused spots are, in fact, images of two planes placed symmetrically about the lens at a distance fg (labeled in the figure as 1 and 2).
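
A minimal numerical sketch of this relation follows, with illustrative values for f0 and fg (the real system's values are not given in the article):

```python
# Effective focal length of the grating/lens pair for orders m = -1, 0, +1,
# using fm = f0 - m*f0**2/fg (valid for f0 << fg). Values are illustrative.
f0 = 100.0   # lens focal length, mm (assumed)
fg = 2500.0  # grating focal length, mm (assumed)

for m in (-1, 0, 1):
    fm = f0 - m * f0**2 / fg
    print(f"order m={m:+d}: combined focal length = {fm:.1f} mm")

# With the detector at the lens focal plane, the -1 and +1 images are
# conjugate to planes roughly fg either side of the lens (planes 1 and 2
# in figure 1), while the m = 0 image stays in focus.
```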

Wavefront Analysis

We can use this optical arrangement in two quite different ways. Roddier showed that the intensity distribution at two planes can provide a means of measuring wavefront curvature [3]. Consider, as an example, a beam of light traveling from left to right whose wavefront is concave at the top and convex at the bottom. As the wavefront propagates, the "rays" travel in a direction perpendicular to the wavefront, so a concave wavefront results in converging rays. As the concave part converges, that portion of the beam becomes more intense after propagation (plane 2) than before propagation (plane 1). Conversely, the convex portion of the wavefront diverges as it travels, so that the beam is less intense after propagation than before.

The beam propagates in the z direction and the wavefront is defined in the r = (x,y) plane. The relationship between the intensity I(r) at the two planes is relatively simple, following from the wave equation. As the beam propagates, the curvature of the phase, ∇²Φ(r), is related to the derivative of the intensity by

    ∇²Φ(r) = -(2π/λ) (1/I(r)) ∂I(r)/∂z    [1]

where λ is the wavelength. We can estimate ∂I(r)/∂z by measuring the intensity at planes 1 and 2 and forming the finite difference (I(r,z2) - I(r,z1))/(z2 - z1). Equation 1 thus defines Φ(r) in terms of the measured derivative of intensity; solving it yields the phase of the wavefront, Φ(r).
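
A minimal sketch of this estimation step, assuming two registered sub-images and a known plane separation; the function and variable names are illustrative:

```python
import numpy as np

def phase_curvature(I1, I2, dz, wavelength):
    """Estimate the phase curvature del^2(Phi) from equation 1,
    using the finite difference (I2 - I1)/dz for dI/dz and the mean
    of the two planes for I(r).

    I1, I2: registered 2-D intensity arrays at planes 1 and 2.
    dz, wavelength: plane separation and wavelength, same length units.
    """
    dI_dz = (I2 - I1) / dz
    I = 0.5 * (I1 + I2)
    # Guard against division by zero in dark regions of the beam.
    return -(2.0 * np.pi / wavelength) * dI_dz / np.maximum(I, 1e-12)
```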

Our group has developed a solution to the equation using a Green's function approach [4]. The tricky part is calculating the Green's function; once it has been computed (as an array of numbers, since the solution is not obtained analytically), the solution to the equation is given by

    Φ(r) = -(2π/λ) ∫ G(r,r′) (1/I(r′)) ∂I(r′)/∂z d²r′    [2]

In the case of pixelated images, the integration in equation 2 becomes a simple matrix multiplication. Using the two images in figure 1, we can determine the wavefront: we take the two sub-images that correspond to the intensities at plane 1 and plane 2, subtract them, and carry out the matrix multiplication. Our current system records images on a 40 x 40 pixel patch and uses the recorded images, together with a precalculated Green's function matrix of 40⁴ numbers, to determine the wavefront as a set of 40 x 40 phase values. We can represent the result as a simple phase map, or we can resolve it into a set of Zernike polynomials. After the zero mode (piston shift), the lowest-order Zernike modes consist of the tip and tilt values, followed by radius of curvature or focus. These parameters give immediate information on the alignment of any beam, enabling a user to employ the device as an alignment tool.
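
In code, the reconstruction step reduces to one subtraction and one matrix-vector product. The sketch below assumes a precomputed 1600 x 1600 Green's function matrix for the 40 x 40 patch; the file name and function name are illustrative assumptions.

```python
import numpy as np

N = 40                           # the sensor works on a 40 x 40 pixel patch
G = np.load("greens_40x40.npy")  # precomputed (N*N, N*N) Green's function
                                 # matrix; the file name is an assumption

def reconstruct_wavefront(I_plane1, I_plane2):
    """Subtract the two defocused sub-images, flatten, apply the
    Green's function matrix, and reshape into a 40 x 40 phase map."""
    dI = (I_plane2 - I_plane1).reshape(N * N)
    phi = G @ dI                 # the integral of equation 2 as one
                                 # matrix product (prefactors assumed
                                 # absorbed into G)
    return phi.reshape(N, N)
```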


Figure 2. Measured versus applied aberration for various Zernike modes shows good measurement accuracy, with rms wavefront errors ranging from λ/25 to λ/43. (Note: The aberration range is from -5λ to 5λ for all modes except spherical aberration, which is from -2λ to 2λ; the lines have been offset vertically for clarity and to avoid overlap.) We applied the aberrations using a programmable diffraction grating.

We validated this concept by applying a series of controlled aberrations to an optical beam and then comparing the measured aberration with the applied value; the experimental results showed good agreement (see figure 2) [5].

Beam Characterization

We can also use the sensor for beam characterization. We note first that the system forms three images, each generated by a different effective lens. If we place a camera at one image plane, the system forms focused images of three separate object planes. Even if the objects are located along the optic axis, the three images will be separated and can be inspected independently (see figure 3).


Figure 3. If a detector is placed at the focus of the short-focal-length lens, it provides three images, which correspond to the three focal lengths of the orders generated by the lens/grating combination. If the three objects are coaxial, the off-axis nature of the grating/lens pair ensures that the three images are separated and distinct.

Figure 4. Nine separate laser beam profiles measured simultaneously using crossed IMP gratings; each profile can be used to measure the Gaussian beam width. We generated this data using amplitude gratings, in which only 4% of the power is diffracted into the ±1 orders. A phase grating can instead control the fraction of total power diffracted into the zero and first orders.

We can exploit this optical arrangement in a quite different way from the wavefront sensing described above. First, note that if we combine two gratings oriented orthogonally, we can form images of nine planes. In the case of a weakly focused incident laser beam, for example, the lens/IMP-grating combination will form an image of the waist at the zero order of both gratings, and the other orders will form images of the laser beam at different points away from the waist.

If we direct a laser beam onto such a sensor, we obtain intensity profiles of the incident laser beam measured at nine different planes (see figure 4). Measuring the Gaussian beam diameter for each profile (or, more rigorously, the second moment of the beam) and plotting it against the position of the respective image plane allows us to determine the value of M² (see figure 5).
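
The fitting step can be sketched as follows, assuming the nine second-moment beam radii and their plane positions have already been extracted from the sub-images. The sketch relies on the standard Gaussian propagation law w(z)² = w0² + (M²λ/(πw0))²(z - z0)², which the article does not spell out; function names and sample values are illustrative.

```python
import numpy as np

def m_squared(z, w, wavelength):
    """Fit w(z)^2 to a quadratic c2*z^2 + c1*z + c0 and return M^2.
    From w^2 = w0^2 + (M^2*lam/(pi*w0))^2 * (z - z0)^2 one finds
    M^2 = (pi/lam) * sqrt(c2*c0 - c1**2/4).
    z: plane positions; w: second-moment beam radii (same length units)."""
    c2, c1, c0 = np.polyfit(z, np.asarray(w)**2, 2)
    return (np.pi / wavelength) * np.sqrt(c2 * c0 - c1**2 / 4.0)

# Illustrative use with synthetic data for a perfect Gaussian (M^2 = 1):
lam, w0 = 1.064e-3, 0.2                     # mm; Nd:YAG wavelength assumed
z = np.linspace(-50, 50, 9)                 # nine image-plane positions, mm
w = np.sqrt(w0**2 + (lam / (np.pi * w0))**2 * z**2)
print(f"M^2 = {m_squared(z, w, lam):.3f}")  # -> ~1.000
```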


Figure 5. Plot of beam width versus grating power for a typical Nd:YAG laser yields the M² value for the beam. The multi-plane sensor measurement yielded a value of 1.3, compared with a conventional M² measurement of 1.25 based on a scanning sensor.

The advantage of this sensor is that it can measure M² in a single frame, allowing single-shot characterization of pulsed lasers or dynamic monitoring of continuous-wave lasers during warm-up. We generated the data shown in figure 5 using an amplitude grating, so most of the power remains in the zero order. Alternatively, we could have used a phase grating, which would have allowed us to control the distribution of power between the different orders. In this way we could have ensured that the central zero-order profile of the beam waist was similar in peak brightness to the profile of the larger beam away from the waist. However, such phase gratings were not available to us at the time.
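
For an idealized 1:1 binary phase grating, scalar diffraction theory gives the order efficiencies as η0 = cos²(φ/2) and η±1 = (2/π)² sin²(φ/2), where φ is the phase step. The sketch below tabulates how the split varies; it is a textbook idealization, not a description of the actual gratings used in the experiments.

```python
import numpy as np

# Scalar-theory order efficiencies for an idealized 1:1 binary phase
# grating with phase step phi (an idealization, not the QinetiQ device):
#   eta_0  = cos^2(phi/2)
#   eta_+1 = eta_-1 = (2/pi)^2 * sin^2(phi/2)
for phi in np.deg2rad([60, 90, 120, 180]):
    eta0 = np.cos(phi / 2) ** 2
    eta1 = (2 / np.pi) ** 2 * np.sin(phi / 2) ** 2
    print(f"step {np.degrees(phi):5.0f} deg: "
          f"eta0 = {eta0:.2f}, eta+1 = eta-1 = {eta1:.2f}")
```

At a phase step of 180 degrees the zero order is fully suppressed, while smaller steps let the designer balance the zero-order waist image against the dimmer defocused images, as described above.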

We have found the beam characterization and wavefront curvature sensing modes of the sensor to be complementary. The sensor acts as an effective tool for quantifying the quality of a laser beam and the degradation caused by any optical imperfection in the subsequent optical system.

The device can also measure the imperfections in an optical surface, performing a role equivalent to that of an optical interferometer. We first collimate the output from a diode laser and direct it to an optical surface. Analysis of the reflected light yields the shape of the wavefront, which matches the shape of the reflecting surface. Equivalently, we can use the system in transmission to characterize the aberrations of windows and other optical components.

Adaptive Optics

Within QinetiQ, we are exploring other potential applications and commercialization. The processing can be extremely fast, requiring only the time to record a patch from a camera and then carry out the subtraction and a single matrix multiplication. This makes it worth considering as an alternative to a Shack-Hartmann wavefront sensor in adaptive optics applications. In early experiments, we incorporated this wavefront curvature sensor into a camera operating at 670 frames/s. The data was fed into a Pentium II computer, and we used the setup to drive an adaptive optics system [6]. We have now built an adaptive optics system with a 2-kframe/s camera and a dedicated digital signal processor, and are currently characterizing its performance.

It is worth making a brief comparison with the Shack-Hartmann sensor, which is commonly used for wavefront sensing. The Shack-Hartmann sensor is prone to alignment error and is susceptible to error when scintillation is strong. It can also be used in beam profiling, generating the classical M² value on the basis of certain propagation assumptions.

In contrast, the curvature sensor does not require careful alignment and is robust, even in the presence of scintillation. The smart sensor does not detect dislocations (also known as branch points or optical vortices), however. These are associated with scintillation and occur when interference causes the intensity to fall to exactly zero at a point on the wavefront. Such a zero-intensity point is analogous to a dislocation in a crystal lattice: as you measure the phase going around the point, the phase increases (or decreases) by 2π in one cycle. This problem is usually only relevant to adaptive optics, and is currently of academic rather than practical interest, as the deformable mirrors in adaptive optics systems are usually continuous and unable to correct the defect even if it is detected.
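
The 2π winding can be demonstrated numerically: sum the wrapped phase differences around any closed loop enclosing the zero-intensity point and the total comes out as ±2π. A minimal sketch with a synthetic single-charge vortex (all values illustrative):

```python
import numpy as np

# Synthetic optical vortex: a field with zero intensity at the origin
# and phase equal to the azimuthal angle around it.
y, x = np.mgrid[-32:32, -32:32] + 0.5
field = (x + 1j * y) * np.exp(-(x**2 + y**2) / 400.0)

# Walk a square loop around the origin, accumulating wrapped phase steps.
r = 10
loop = ([(r, k) for k in range(-r, r)] + [(k, r) for k in range(r, -r, -1)]
        + [(-r, k) for k in range(r, -r, -1)] + [(k, -r) for k in range(-r, r)])
phases = np.array([np.angle(field[32 + i, 32 + j]) for i, j in loop])
dphi = np.angle(np.exp(1j * np.diff(phases, append=phases[:1])))
print(f"winding around the dislocation: {dphi.sum() / (2 * np.pi):+.2f}")
# -> +1.00 or -1.00 depending on loop direction and vortex sign
```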

Another drawback of this technique is that the focal length of the grating varies inversely with wavelength, so we need to know the wavelength to calibrate the measurement. This sets limits on the technique: it fails for very broadband sources or femtosecond lasers, for which the dispersion due to spectral bandwidth may produce a broadening that is comparable to, or larger than, the physical diameter of the beam.

This new type of smart sensor can measure the shape of an optical wavefront or characterize laser beam parameters. It offers a powerful method for laser assessment and alignment, and serves as a diagnostic tool for measuring optical surfaces and transmissive components. oe

References

1. P. Blanchard and A. Greenaway, Applied Optics 38[32], p. 6692 (1999).

2. A. Scott and S. Woods, Proc. SPIE 4629 (2002).

3. F. Roddier, Applied Optics 27[7], p. 1223 (1988).

4. S. Woods and A. Greenaway, J. Opt. Soc. Am. A 20[3], p. 508 (2003).

5. P. Blanchard et al., Applied Optics 39[35], p. 6649 (2000).

6. J. Burnett et al., "Wavefront measurement over an extended horizontal path using a wavefront curvature sensor," to be published in the Proceedings of the 4th International Workshop on Adaptive Optics for Industry and Medicine, Muenster, Germany (2003).


Andrew Scott
Andrew Scott is business manager and team leader for free space optical systems at QinetiQ, Malvern, UK.
