Imagery analysis: standards needed for the drone age
Imagine that aerial imagery gathered from an unmanned aerial vehicle (UAV) sensor shows you at the flashpoint of a demonstration gone violent. You are now facing criminal charges, and key to your guilt or innocence is IR imagery depicting some nearby bright flashes. You know that they were sun reflections off broken glass, but prosecutors contend that they are the IR signatures of Molotov cocktails. Their expert (who has never seen Molotov signatures on IR before) has calculated that sun reflections from glass were too weak to be seen by the camera. Your expert concluded otherwise, but you are more than a little apprehensive as you realize that your fate now hinges on scientific statements made in a venue where no standards to judge their quality exist.
The above scenario is less far-fetched than it sounds. It is the result of combining two conditions—the expected proliferation of UAVs in America's skies1 and the lack of standards to judge the quality of an analysis—with some recent history, namely, the controversy over the origin of bright flashes appearing on forward-looking IR (FLIR) imagery captured during the 1993 standoff between religious sect members and federal agents at Waco, TX. (By federal agents I mean individuals acting on behalf of the government. No specific agency affiliation should be assumed.) The dispute's primary technical issue was whether these flashes represented gunfire or some other phenomenon, such as solar reflections from debris. By examining issues pertinent to this controversy, we can gain insight into what is required to improve the quality of imagery analysis.
My approach to the problem borrowed from my background in radiometric calibration. Like calibration, analysis is a process: the analyst arranges component parts (facts, in the analysis case) in a logical order to reach a result, and each statement taken as fact contributes an uncertainty to that result. By examining significant statements from analysts who had worked on the flash problem,2–6 I developed error categories. These categories gave me a closer look at what analysts thought to be the facts behind their statements. I also documented how analysts obtained their facts, with observation, modeling, testing, and literature searches among the methods.
The error categories I identified were: errors resulting from an incorrect calculation or model; errors embedded within analysis logic; and errors arising from an analyst's lack of knowledge about a key phenomenon. An example of the first error type is the statement that sun reflections do not show up on thermal IR imagery. While this error in fact resulted from incomplete modeling, it was also contradicted by the data. Figure 1 shows a metal post, circled, from which a strong glint may be seen in Video 1.7 Even if we did not have the photo for reference, the close alignment of the long shadow from the tower with our viewing direction is a strong indicator of sunglint.
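To see why that model was incomplete, consider a first-order radiometric comparison. The sketch below is purely illustrative: the 3-5µm band, the 10% reflectance, and the neglect of the atmosphere are assumptions I have chosen for this article, not parameters from any Waco analysis. Even under these crude conditions, diffusely reflected sunlight is comparable to the self-emission of a 300K background, and a specular glint can be far brighter.

```python
# First-order check: can reflected sunlight register against a 300 K
# thermal background? Band, reflectance, and emissivity are assumed values.
import numpy as np

H = 6.626e-34   # Planck constant, J s
C = 2.998e8     # speed of light, m/s
K = 1.381e-23   # Boltzmann constant, J/K

def planck_radiance(wl, temp):
    """Blackbody spectral radiance, W m^-2 sr^-1 m^-1."""
    return (2 * H * C**2 / wl**5) / np.expm1(H * C / (wl * K * temp))

def band_radiance(temp, wl):
    """Trapezoidal integration of spectral radiance over the band."""
    l = planck_radiance(wl, temp)
    return float(np.sum(0.5 * (l[1:] + l[:-1]) * np.diff(wl)))

wl = np.linspace(3e-6, 5e-6, 1000)   # assumed midwave IR band, m
omega_sun = 6.8e-5                   # solid angle subtended by the sun, sr
rho = 0.10                           # assumed reflectance of glass or metal
eps = 0.95                           # assumed background emissivity

# In-band solar irradiance at the ground (atmosphere ignored) and the
# radiance a diffuse (Lambertian) reflector would return to the sensor.
e_sun = band_radiance(5778.0, wl) * omega_sun
l_reflected = rho * e_sun / np.pi

# Self-emission of a 300 K graybody background in the same band.
l_thermal = eps * band_radiance(300.0, wl)

print(f"Diffusely reflected sunlight: {l_reflected:.2f} W m^-2 sr^-1")
print(f"300 K background emission:    {l_thermal:.2f} W m^-2 sr^-1")
# A specular facet aligned with the sun returns far more than the diffuse
# case, so 'sun reflections cannot appear on thermal imagery' fails even
# this simple test.
```

This is only a sanity check; the stronger evidence remains the observation itself, as Figure 1 and Video 1 show.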

Uncovering the facts behind an embedded error required a few more steps. According to several government analysts, persons had to be viewable near an IR flash for the flash to be a gunfire signature.8, 9 Embedded within this proposition are the supporting facts that the thermal and spatial resolution of the IR video were sufficient to allow persons to be seen. Video 2 refutes these underlying assumptions, showing that persons were viewable only when the thermal contrast was great enough to allow it.10 The video also exhibits insufficient spatial resolution to see individuals who had stopped moving. Figure 2 depicts conditions near a flash in which the likelihood of seeing persons is even lower than in the video. I sought to decouple this error from analysis logic by selecting the reporting format shown in Table 1, which prominently features key facts and information on how they were obtained.
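A similarly simple sketch illustrates why the embedded assumption deserves scrutiny. The numbers below are placeholders chosen for illustration, not the parameters of the FLIR system flown at Waco; they show how quickly a person can fall below both the spatial and the thermal resolution of an airborne imager.

```python
# Back-of-the-envelope resolution check with assumed, illustrative numbers
# (not the parameters of the sensor flown at Waco).
slant_range_m = 2000.0   # assumed slant range from aircraft to scene
ifov_mrad = 0.5          # assumed instantaneous field of view, mrad
person_extent_m = 0.5    # across-track extent of a prone person, m
scene_delta_t_k = 0.3    # assumed person-to-background contrast, K
netd_k = 0.2             # assumed noise-equivalent temperature difference, K

pixel_footprint_m = slant_range_m * ifov_mrad * 1e-3
pixels_on_target = person_extent_m / pixel_footprint_m

print(f"Pixel footprint on the ground: {pixel_footprint_m:.2f} m")
print(f"Pixels across a prone person:  {pixels_on_target:.1f}")
print(f"Thermal contrast above NETD:   {scene_delta_t_k > netd_k}")
# With a footprint comparable to the target and contrast close to the noise
# floor, 'no person is visible near the flash' cannot bear the weight the
# embedded assumption places on it.
```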

Table 1. Key facts and how they were determined.
Number | Fact | How determined?
1 | Sun reflections may be seen in the IR images | By observation; this correlated with calculations |
2.1 | No people are seen near the flashes | By observation |
2.2 | Spatial and thermal resolution were sufficient to view persons near flashes | By observation |
3 | Gunfire flashes do not last long enough to have generated the flashes on FLIR | Literature survey |
The most obvious source of error in the Waco controversy was analysts' lack of experience with the primary phenomenon, gunfire, and I created a separate category called ‘phenomenology error’ to address it. I suggested that the practice of retaining discipline specialists—standard in audio and video authentication today—be extended to subject matter in which the analyst has no experience. For most Waco analysts, these subjects included muzzle flash phenomenology, the differences between sniper rifles and combat arms, military versus commercial ammunition, flash suppressants, and environmental influences on flash signatures. The specialist's judgment on the issue would appear in the report with his/her name and credentials documented in the ‘How determined?’ column.
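One way to picture the resulting report is as a list of records, one per fact. The sketch below only illustrates the format; the field names and the specialist entry are assumptions made for this article, not part of any existing tool.

```python
# Hypothetical record structure mirroring Table 1 plus the proposed
# specialist attribution; field names are illustrative assumptions.
from dataclasses import dataclass
from typing import Optional

@dataclass
class ReportedFact:
    number: str                       # e.g. "2.1" for a supporting sub-fact
    statement: str                    # the fact as used in the analysis
    how_determined: str               # observation, modeling, testing, literature
    specialist: Optional[str] = None  # name and credentials, if a discipline
                                      # specialist supplied the judgment

facts = [
    ReportedFact("1", "Sun reflections may be seen in the IR images",
                 "Observation, correlated with calculations"),
    ReportedFact("2.1", "No people are seen near the flashes", "Observation"),
    ReportedFact("3", "Gunfire flashes do not last long enough to have "
                 "generated the flashes on FLIR", "Literature survey",
                 specialist="Muzzle-flash phenomenologist (name, credentials)"),
]

for f in facts:
    line = f"{f.number}: {f.statement} [{f.how_determined}]"
    if f.specialist:
        line += f" -- {f.specialist}"
    print(line)
```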
My work thus far11 has focused on breaking down the statements used in analysis into their component facts and on developing a reporting format that includes the methods used to obtain them. I am now working on improvements, including metrics that allow uncertainties to be quantified. For example, there is less uncertainty in actually observing a sunglint than in calculating that one may be observed. Even simply adopting reporting standards like those suggested here would constitute a step forward. We stand poised on the brink of a new era in imagery collection. We need to be ready with standards for imagery analysis.
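Those metrics are still taking shape. Purely as a sketch of one possible direction, and not a published metric, the method recorded in the 'How determined?' column could carry an ordinal uncertainty rank, with a chain of facts judged no stronger than its weakest link:

```python
# Illustrative ordinal uncertainty ranks by determination method; the values
# are assumptions for this sketch, not a published or validated metric.
METHOD_UNCERTAINTY = {
    "observation": 1,        # the effect was seen directly in the data
    "testing": 2,            # reproduced under controlled conditions
    "modeling": 3,           # calculated that the effect may be observed
    "literature survey": 3,  # reported by others under similar conditions
}

def chain_uncertainty(methods):
    """A conclusion built on several facts is no stronger than its weakest one."""
    return max(METHOD_UNCERTAINTY[m] for m in methods)

# The sunglint example: observing a glint (rank 1) carries less uncertainty
# than calculating that one may be observed (rank 3).
print(chain_uncertainty(["observation"]))              # 1
print(chain_uncertainty(["modeling", "observation"]))  # 3
```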
Barbara G. Grant, SPIE senior member, earned her master's in optical sciences from the University of Arizona. She has more than 30 years' professional engineering experience encompassing areas such as radiometric calibration, electro-optical systems, and imagery analysis. She is the author of Field Guide to Radiometry, co-author of The Art of Radiometry, and is working on a book on UAV imaging sensors. She teaches classes to professionals through SPIE, the University of California Irvine Extension, and Georgia Tech Professional Education.