
Passive three-dimensional sensing by polarimetric imaging

A single sensor can be used to extract 3D information from a scene by computing the orientation angles of the emitting surface normal vector.
17 November 2006, SPIE Newsroom. DOI: 10.1117/2.1200611.0434

In the rapidly growing fields of computer vision,1,2 remote sensing, and automatic target recognition, much research is focused on obtaining an adequate perception of the real 3D world to gain insight into the locations and physical characteristics of various objects. For a variety of reasons, cost and security being the most important, the selected sensors are typically passive imaging sensors that provide only a 2D view of the world.

Consequently, passive 3D sensing has been the object of significant research effort during the past decades, with the goal of exploring how to obtain 3D information from 2D imagery. One approach uses multiple 2D passive sensors positioned at different geographical locations in a scene, followed by triangulation to determine range information. Alternatively, a moving 2D sensor can be used, with 3D information obtained by exploiting motion parallax or optical flow. Though effective in many situations, these techniques suffer from a number of shortcomings. For example, the multiple-sensor approach imposes the stringent requirement that very accurate registration be maintained among the sensors, which can prove difficult to satisfy. The motion-parallax approach requires a dense optical flow field to determine range accurately, which is also hard to achieve.

In our work, we have explored a method for obtaining 3D information based on the use of a passive infrared polarimetric imaging sensor.1,3,4 Unlike passive sensors operating in the visible range of the electromagnetic spectrum, passive infrared sensors (PIRs) measure emitted and reflected radiation in the 3–5 and 8–14μm ranges. They are, accordingly, very useful at night and under low visibility conditions. With these sensors, the orientations of the surface patches in a scene are determined from the state of the polarization vectors obtainable at each output pixel.

The polarimetric imaging sensor used in our work captures the intensity outputs of four distinctly oriented polarimetric filters that pass (emitted) light at 0°, 45°, 90°, and 135° orientations. The first three components of the Stokes vector that mathematically describes the state of polarization at each pixel in the imagery are then obtained. From the Stokes parameters, two other quantities are defined that play direct roles in determining surface orientation in a scene. The first is the angle of polarization (or polarization ellipse tilt angle), ψ, defined as half the arc tangent of the ratio of the third and second Stokes parameters. The second is the percent of linear polarization, r, defined as the ratio of the sum of the second and third Stokes parameters to the sum of the first, second, and third parameters.
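For concreteness, the sketch below shows one way the per-pixel Stokes parameters, ψ, and r could be computed from the four filter-channel images. The function and variable names are illustrative rather than taken from the article, and r is computed here with the conventional degree-of-linear-polarization formula, sqrt(s1² + s2²)/s0.

```python
import numpy as np

def stokes_from_polarizer_images(i0, i45, i90, i135):
    """Estimate the first three Stokes parameters per pixel from four
    linear-polarizer intensity images (0, 45, 90, 135 degrees).
    Standard linear-Stokes relations; names are illustrative."""
    s0 = 0.5 * (i0 + i45 + i90 + i135)   # total intensity
    s1 = i0 - i90                        # 0/90-degree preference
    s2 = i45 - i135                      # 45/135-degree preference
    return s0, s1, s2

def polarization_angle(s1, s2):
    """Angle of polarization: psi = 0.5 * arctan(s2 / s1), per pixel."""
    return 0.5 * np.arctan2(s2, s1)

def percent_linear_polarization(s0, s1, s2):
    """Degree of linear polarization.  This sketch uses the common form
    sqrt(s1^2 + s2^2) / s0; the article phrases r as a ratio of sums of
    Stokes parameters."""
    return np.sqrt(s1**2 + s2**2) / np.maximum(s0, 1e-12)
```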

Refraction polarization and Fresnel equations

Thermal polarization can be viewed as a special case of refraction polarization. In this case, electromagnetic (EM) waves emanating from an object impinge on its interface with air, refract, travel through the atmosphere, and are intercepted by an IR sensor.

If we denote the surface of the interface patch by s, the orientation of the vector normal to this surface can be described by two angles: the angle Φ between the sensor line of sight and the normal to the interface patch, and the azimuth angle of the plane of incidence. This plane is formed by the line of sight from the sensor to a point on s and the normal to the surface at that point. Upon impinging on the interface, an unpolarized EM wave is both reflected and refracted. The refracted wave is partially polarized parallel to the plane of incidence, and the reflected wave is partially polarized perpendicular to it. These two components are related to two angles: the angle at which the EM wave impinges on s from within the material and the angle of incidence, Φ. These relationships are referred to as the Fresnel equations.5 The previously defined percent of polarization (r) can then be shown to be the ratio of the difference to the sum of these parallel and perpendicular intensity components.

Using these equations and Snell's law, which relates the angles of incidence and refraction for a wave impinging on an interface between two media with different indices of refraction, two analytical solutions can be obtained for Φ in terms of the percent of linear polarization and the index of refraction. Figure 1 shows a plot of one of the solutions (Φ1), with n varied from 1.001 to 10. In general, however, n is a complex number. Its imaginary part, called the extinction coefficient, is indicative of the absorption loss as the EM wave propagates through the material.

Figure 1. A plot of one of the analytical solutions obtained for surface normal depression angles as a function of the relative index of refraction of the emitting object (n) and the percent of linear polarization (r).
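Below is a minimal numerical sketch of the Φ–r–n relationship plotted in Figure 1, assuming a real-valued relative index n, a smooth dielectric surface, and Kirchhoff's law (polarized emissivity equal to one minus the corresponding Fresnel reflectance). The function names are hypothetical, and the article's closed-form Φ1 and Φ2 solutions are replaced here by a simple grid-based root search.

```python
import numpy as np

def fresnel_emission_dolp(phi, n):
    """Degree of linear polarization of thermal emission viewed at angle
    phi (radians, measured from the surface normal) for a real relative
    index n, using emissivity = 1 - Fresnel reflectance."""
    sin_t = np.sin(phi) / n                          # Snell's law
    cos_t = np.sqrt(1.0 - sin_t**2)
    cos_i = np.cos(phi)
    rs = (cos_i - n * cos_t) / (cos_i + n * cos_t)   # s-polarized amplitude
    rp = (n * cos_i - cos_t) / (n * cos_i + cos_t)   # p-polarized amplitude
    eps_s = 1.0 - rs**2                              # s emissivity
    eps_p = 1.0 - rp**2                              # p emissivity
    return (eps_p - eps_s) / (eps_p + eps_s)

def depression_angles_from_dolp(r_measured, n, num=20000):
    """Numerically invert r(phi, n) = r_measured on a fine grid of phi,
    returning all crossings (a stand-in for the closed-form Phi1/Phi2)."""
    phi = np.linspace(1e-4, np.pi / 2 - 1e-4, num)
    diff = fresnel_emission_dolp(phi, n) - r_measured
    crossings = np.where(np.diff(np.sign(diff)) != 0)[0]
    return phi[crossings]
```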

To completely describe the surface normal vector, the azimuth angle of the plane of incidence is required in addition to the angle Φ. This azimuth is the same as the polarization tilt angle, ψ. Thus, the surface normal at a point in the scene is completely specified once ψ and Φ are known.
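As an illustration, the fragment below assembles a unit normal from ψ and Φ under an assumed sensor-centered coordinate convention (z axis pointing from the surface patch toward the sensor, ψ measured in the image plane); both the convention and the function name are assumptions of this sketch rather than the article's.

```python
import numpy as np

def surface_normal_from_angles(psi, phi):
    """Unit surface normal from the polarization tilt angle psi and the
    depression angle phi (both in radians).  Assumes z points toward the
    sensor and psi is the azimuth of the plane of incidence in the image
    plane -- a coordinate convention chosen for this sketch."""
    return np.array([
        np.sin(phi) * np.cos(psi),
        np.sin(phi) * np.sin(psi),
        np.cos(phi),
    ])
```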

Experimental results

Using a physics-based IR modeling tool,1 we generated polarimetric images of a tactical scene in which several aircraft hangars were connected by runways and the target was placed on the grass. In one set of images (Figures 2 and 3), the target was an M35 truck. For the polarimetric sensor, we included the characteristics of a mid-band infrared (3–5μm) indium antimonide focal plane array. We then made the following simplifying assumptions: all surfaces had the same temperature; only two surface materials existed, namely grass and glossy paint; and no sun was present.

Figure 2. The azimuth angle of the surface normal is shown for a scene containing a military truck placed near a runway.
Figure 3. The depression angle of the surface normal is shown for a scene containing a military truck placed near a runway. In this case, the Φ1 solution was used.

The range from sensor to target varied from 5km to 12km. For each scenario, r, ψ, and the two resulting Φ-angle images were computed for a range of refraction indices from 1.001 to 10. Figures 2 and 3 show the ψ image and one of the two resulting Φ-angle images for relatively large r values (>0.4) for the M35 truck at a distance of 5km. To assess the sensitivity of the depression angle Φ with respect to the index of refraction, the Φ angles of six different surface patches were computed for the M35 truck and also for a T72 tank. For the truck, the Φ angles of all three patches had similar values, yet their corresponding ψ angles were not identical, indicating that the patches came from different portions of the curved surface. The data also showed that the estimated Φ values had standard deviations of around 6.5° for the tank patches, compared to 6.8° for the truck patches. The variations decreased significantly at larger values of n. In our experiment, n was varied in equal increments from 1.001 to 10. In real-world scenarios, however, the variations of n are fairly small, and consequently the Φ-angle variations would be much smaller than what we observed. Thus our approach could provide reasonable estimates of actual surface-normal depression angles.
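A sensitivity sweep of this kind can be approximated with the hypothetical helpers from the earlier Fresnel sketch (an assumption of this example), for instance by fixing a measured r value and varying n over the same 1.001–10 range:

```python
import numpy as np

# Assumes depression_angles_from_dolp() from the earlier sketch is in scope.
r_meas = 0.4                                  # example percent of linear polarization
n_values = np.linspace(1.001, 10.0, 50)       # candidate indices of refraction
phi_deg = []
for n in n_values:
    roots = depression_angles_from_dolp(r_meas, n)
    if roots.size:                            # small n may never reach r_meas
        phi_deg.append(np.degrees(roots[0]))  # first (Phi1-like) crossing
print("mean Phi: %.1f deg, std: %.1f deg" % (np.mean(phi_deg), np.std(phi_deg)))
```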

In another experiment, we used a polarimetric long-wave (8–14μm) mercury cadmium telluride IR imaging sensor to obtain Φ- and ψ-angle imagery of a rectangular plate (Figure 4) in a laboratory environment. Figure 5 shows the depression-angle image of the surface normals. Differently colored patches on the surface of the plate indicate the presence of different indices of refraction (due to different target paints). The actual values of these indices can be computed directly from the depression angles and their corresponding percent-of-polarization values.

Figure 4. The large plate used in the experiment is shown in the lower part of the figure, beneath a smaller block.
Figure 5. The Φ image of the plate used in our experiment. Different intensity values (colors) indicate the presence of different indices of refraction (different paints/materials).

By direct measurements of the depression angles of the surface normals, we obtained estimates of the indices of refraction of the various paints on the surface of the plate: useful data for their classification.
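This inverse step can be sketched by reusing the hypothetical fresnel_emission_dolp() function from the earlier sketch and searching over n for the value that reproduces the measured r at the known Φ. The helper name and the search grid are assumptions of this example; a real material may also require modeling the imaginary (absorbing) part of n.

```python
import numpy as np

# Assumes fresnel_emission_dolp() from the earlier sketch is in scope.
def index_from_angle_and_dolp(phi, r_measured, n_grid=None):
    """Estimate the (real) index of refraction from a known depression
    angle phi (radians) and measured percent of linear polarization."""
    if n_grid is None:
        n_grid = np.linspace(1.001, 10.0, 5000)
    residual = np.abs(fresnel_emission_dolp(phi, n_grid) - r_measured)
    return n_grid[np.argmin(residual)]
```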


The problem of extracting 3D information from a single polarimetric imaging sensor can be addressed by computing the orientation angles of the emitting surface normal vector. Using the Fresnel equations and Snell's law, the depression angle of this vector can be analytically derived. Our experimental results show that computing these angles for a range of indices of refraction can yield a reasonable estimate of the object's 3D surface normal orientation angles. Conversely, if the depression angles are known, we can compute the indices of refraction of the emitting surfaces, which allows their classification.

Firooz Sadjadi
Lockheed Martin Corporation
Eagan, MN

Firooz Sadjadi is a member of the technical staff at Lockheed Martin Corporation. He received his BSEE from Purdue University in 1972, his MSEE in 1974, and his DEE in 1976 from the University of Southern California. His interests are in theoretical and experimental research related to signal and image processing, pattern recognition, target tracking, and information fusion. He is an author and editor of many publications, including a forthcoming book entitled Physics of Automatic Target Recognition to be published by Springer in 2006. He is a Fellow of the International Society for Optical Engineering (SPIE), and a member of Sigma Xi, IEEE, and OSA. In addition, he has been chairman of the Automatic Target Recognition Conference for the past 16 years and has written numerous papers for this and other SPIE conferences. He is also the founder and chairman of the ATR Technical Working Group. He has edited three books published by SPIE Press and has been guest editor for two special issues of Optical Engineering.

2. L. B. Wolff and T. E. Boult, Constraining object features using a polarization reflectance model, IEEE Trans. Pattern Analysis and Machine Intelligence 13 (7), 1991.
4. S. Huard, Polarization of Light, John Wiley & Sons, 1996.
5. M. Born and E. Wolf, Principles of Optics, Cambridge University Press, 1998.