21 - 25 April 2024
National Harbor, Maryland, US
New and emerging cameras rely heavily on image processing for improved performance. Modeling, associated metrics, and testing need significant upgrading to meet these new challenges. Novel image processing may not guarantee task-performance improvement. Emerging targets of all sizes need to be characterized in all relevant wavelength bands. How can we leverage new metrics in decision making? Cameras consist of two major components: the sensor (optics + detector array) and the image processing subsystem. Each component must be modeled and tested, as well as the combination. Here, the sensor output (test points) must be available to the test engineer.

Papers are solicited for the following areas. This conference focuses on overall system-level modeling and testing; submitted papers devoted exclusively to algorithm design will be referred to other conferences.

 

AWARDS
This year, this conference grants three awards: 1) Best Presentation, selected by the conference committee; 2) Santa Barbara Infrared (SBIR) is offering a $250.00 honorarium for an innovation award, granted for the most innovative test technique; and 3) True Colors Infrared Imaging (TCII) is awarding a $250.00 honorarium for the most innovative modeling and simulation-driven imager design or analysis paper presented by a student or young researcher. All awardees will receive recognition on their published paper in the conference proceedings, provided that the manuscript is submitted before the end of the conference.

Innovation Awards sponsored by Santa Barbara Infrared (SBIR) and True Colors Infrared Imaging (TCII).
Conference 13045

Infrared Imaging Systems: Design, Analysis, Modeling, and Testing XXXV

23 - 25 April 2024 | National Harbor 3 (Tues-Wed) and National Harbor 6 (Thurs)
  • Symposium Panel on Microelectronics Commercial Crossover
  • Welcome and Opening Remarks
  • 1: Infrared Imaging Systems I
  • 2: Infrared Imaging Systems II
  • Poster Session
  • Symposium Plenary on AI/ML + Sustainability
  • 3: Infrared Imaging Systems III
  • Round-Table Discussion: Current Modeling Limitations
  • 4: Infrared Imaging Systems IV
  • 5: Infrared Imaging Systems V
  • 6: Infrared Imaging Systems VI
  • Round-Table Discussion: Future Testing Requirements
  • 7: Infrared Imaging Systems VII
Information

See the Awards page for full details.

Symposium Panel on Microelectronics Commercial Crossover
23 April 2024 • 8:30 AM - 10:00 AM EDT | Potomac A

View Full Details: spie.org/dcs/symposium-panel

The CHIPS Act Microelectronics Commons network is accelerating the pace of microelectronics technology development in the U.S. This panel discussion will explore opportunities for crossover from commercial technology into DoD systems and applications, discussing what emerging commercial microelectronics technologies could be most impactful on photonics and sensors and how the DoD might best leverage commercial innovations in microelectronics.

13165-601
Author(s): John M. Pellegrino, Georgia Tech Research Institute (retired) (United States); Shamik Das, The MITRE Corp. (United States); Erin Gawron-Hyla, U.S. Dept. of Defense (United States); Carl E. McCants, Defense Advanced Research Projects Agency (United States); Kyle D. Squires, Arizona State Univ. (United States); Anil Rao, Intel Corp. (United States)
23 April 2024 • 8:30 AM - 10:00 AM EDT | Potomac A
The CHIPS Act Microelectronics Commons network is accelerating the pace of microelectronics technology development in the U.S. This panel discussion will explore opportunities for crossover from commercial technology into DoD systems and applications. Experts representing the Microelectronics Commons program, government R&D, commercial industry, DoD industry, and academia will discuss what emerging commercial microelectronics technologies could be most impactful on photonics and sensors and how the DoD might best leverage commercial innovations in microelectronics.
Welcome and Opening Remarks
23 April 2024 • 1:30 PM - 1:40 PM EDT | National Harbor 3
David P. Haefner, DEVCOM C5ISR (United States); Gerald C. Holst, JCD Publishing (United States)
Session 1: Infrared Imaging Systems I
23 April 2024 • 1:40 PM - 3:10 PM EDT | National Harbor 3
Session Chairs: Daniel A. LeMaster, U.S. Dept. of Transportation (United States), Ronald G. Driggers, Wyant College of Optical Sciences (United States)
13045-1
Author(s): Austin A. Richards, Oculus Photonics LLP (United States)
23 April 2024 • 1:40 PM - 2:10 PM EDT | National Harbor 3
This invited talk is a survey of the author's experiences over 40 years of imaging in low-light environments, starting with image converter tubes and image intensifiers and transitioning to silicon sensors. The talk will give an overview of the best-practices lessons learned around system design, sensor comparison, data collection, active target illumination, scene irradiance measurements and the generation of ambient lighting environments in controlled laboratory spaces.
13045-2
Author(s): David P. Haefner, DEVCOM C5ISR (United States); Aaron Hendrickson, Naval Air Warfare Ctr. Aircraft Div. (United States); Austin A. Richards, Oculus Photonics LLP (United States); Martin Hübner, HENSOLDT Optronics GmbH (Germany); Bradley L. Preece, Brian Teaney, Stephen D. Burks, DEVCOM C5ISR (United States)
23 April 2024 • 2:10 PM - 2:30 PM EDT | National Harbor 3
In 2023, Richards and Hübner proposed silux [ ] as a new standard unit of irradiance for the full 350-1100 [nm] band, specifically addressing the mismatch between the photopic response of the human eye and the spectral sensitivity of new low-light silicon CMOS sensors with enhanced NIR response. In this correspondence, we demonstrate a per-pixel calibration of a camera to create the first imaging siluxmeter. To do this, we developed a comprehensive per-pixel model as well as the experimental and data-reduction methods to estimate its parameters. These parameters are then combined into an updated NVIPM measured-system component that provides the conversion factor from device units of DN to silux, lux, and other radiometric units. Additionally, the accuracy of the measurements and modeling is assessed through comparisons to field observations and by validating/transferring calibration from one low-light camera to another. Following this process, other low-light cameras can be calibrated and applied to scenes such that they may be accurately characterized using silux as the standard unit.
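The per-pixel DN-to-silux conversion described in the abstract can be sketched as a gain/offset calibration applied pixel-wise; a minimal sketch, where the map values, integration time, and unit convention are illustrative assumptions, not values from the paper:

```python
import numpy as np

def calibrate_to_silux(dn, gain, offset, t_int):
    """Per-pixel conversion from raw digital numbers (DN) to silux.

    dn, gain, offset are H x W arrays; t_int is integration time in seconds.
    gain is assumed to carry units of silux*s/DN, so the result is a
    per-pixel silux irradiance map.
    """
    return gain * (dn - offset) / t_int

# Hypothetical 2x2 sensor with per-pixel gain/offset maps estimated beforehand
dn = np.array([[1200.0, 1180.0], [1210.0, 1195.0]])
gain = np.array([[2.0e-4, 2.1e-4], [1.9e-4, 2.0e-4]])
offset = np.array([[200.0, 195.0], [205.0, 198.0]])
silux_map = calibrate_to_silux(dn, gain, offset, t_int=0.01)
```

In practice the gain and offset maps would come from the experimental and data-reduction methods the paper describes, not from fixed constants.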
13045-3
Author(s): Lindsey Wiley, Adam Katheder, Joshua Follansbee, Wyant College of Optical Sciences, The Univ. of Arizona (United States); Jeff Voss, Richard E. Pimpinella, Sivananthan Labs., Inc. (United States); Ronald G. Driggers, Wyant College of Optical Sciences, The Univ. of Arizona (United States)
23 April 2024 • 2:30 PM - 2:50 PM EDT | National Harbor 3
Interest in the eSWIR band is growing due to focal plane array technology advancements with mercury cadmium telluride and type-II superlattice materials. As design and fabrication processes improve, eSWIR detector size, weight, and power can now be optimized. For some applications, it is desirable to have a smaller detector size. Reduced solar illumination in the 2 to 2.5 μm spectral range creates a fundamental limit to passive imaging performance in the eSWIR band where the resolution benefit of small detectors cannot out-compete the reduced SNR in photon-starved environments. This research explores the underlying theory using signal-to-noise ratio radiometry and modeled target discrimination performance to assess the optimal detector size for eSWIR dependent upon illumination conditions. Finally, we model continuous-wave laser illumination in the eSWIR band to compare the effect of detector size on active and passive imaging for long-range object discrimination.
13045-4
Author(s): Jennifer Hewitt, C. Kyle Renshaw, CREOL, The College of Optics and Photonics, Univ. of Central Florida (United States); Ronald G. Driggers, Wyant College of Optical Sciences, The Univ. of Arizona (United States)
23 April 2024 • 2:50 PM - 3:10 PM EDT | National Harbor 3
Time-limited search modeling has been an important aspect of sensor design for over two decades. In past work, we introduced a model that incorporated camera matrix theory into a pre-existing time-limited model for moving-sensor scenarios, for the purpose of optimizing sensor orientation for a given platform speed and height. During the introduction of this model, it was established that optimization in this way requires balancing sensor range to target against time on target. In this study, we further explore the capabilities of this new model by optimizing sensor configuration in a few selected scenarios, with a focus on how sensor orientation, platform speed, and platform height interact with one another.
Break
Coffee Break 3:10 PM - 3:40 PM
Session 2: Infrared Imaging Systems II
23 April 2024 • 3:40 PM - 5:20 PM EDT | National Harbor 3
Session Chairs: Ronald G. Driggers, Wyant College of Optical Sciences (United States), Daniel A. LeMaster, U.S. Dept. of Transportation (United States)
13045-5
Author(s): Shane Jordan, Ronald G. Driggers, Wyant College of Optical Sciences, The Univ. of Arizona (United States); Orges Furxhi, True Colors Infrared Imaging (United States); Patrick Leslie, The Univ. of Arizona (United States); Richard C. Cavanaugh, Wyant College of Optical Sciences, The Univ. of Arizona (United States); C. Kyle Renshaw, CREOL, The College of Optics and Photonics, Univ. of Central Florida (United States); Eddie L. Jacobs, The Univ. of Memphis (United States)
23 April 2024 • 3:40 PM - 4:00 PM EDT | National Harbor 3
One of the primary activities in emissive infrared imager design is the trade on whether to use the midwave infrared or longwave infrared band in the application. Applications including target acquisition (both target search and target identification), threat warning, aircraft detection, and pilotage have performance dependent upon the background scene contrast. This study discusses the differences observed in scene contrast between the midwave infrared and longwave infrared bands. Scene contrast comparisons are provided under different conditions such as rural versus urban and day versus night. This comparison provides the infrared system designer with the means to perform detailed engineering trades.
13045-6
Author(s): David P. Haefner, DEVCOM C5ISR (United States); Aaron Hendrickson, U.S. Navy (NAWCAD) (United States); Stephen D. Burks, DEVCOM C5ISR (United States)
23 April 2024 • 4:00 PM - 4:20 PM EDT | National Harbor 3
This manuscript presents a systematic approach for developing new measurements and evaluation techniques through modeling and simulation. A proposed sequence of steps is outlined, starting with defining the desired measurable(s), going through model development and exploration, conducting experiments, and publishing results. This framework, based on the scientific method, provides a structured process for creating robust, well-defined measurement procedures before experiments are performed. The approach is demonstrated through a case study on measuring camera-to-display system latency. A simulation tool is described that enables exploration of how different experimental parameters like camera temporal response, display properties, and source characteristics impact the measurement and associated uncertainties. Several examples illustrate using the tool to establish notional guidelines for optimizing the experimental design. The simulation-driven process aims to increase confidence in new measurement techniques by incrementally refining models, identifying assumptions, and evaluating potential error sources prior to costly physical implementation.
13045-7
Author(s): Joshua Follansbee, Eric Mitchell, The Univ. of Arizona (United States); C. Kyle Renshaw, CREOL, The College of Optics and Photonics, Univ. of Central Florida (United States); Ronald G. Driggers, Wyant College of Optical Sciences, The Univ. of Arizona (United States)
23 April 2024 • 4:20 PM - 4:40 PM EDT | National Harbor 3
Coherent illumination of an optically rough surface creates random phase variations in the reflected electric field, which propagation converts into amplitude fluctuations known as speckle in both the pupil and image planes. Infrared imaging systems are often parameterized by the quantity Fλ/d, which relates the cutoff frequencies passed by the optical diffraction MTF to the frequencies passed by the detector MTF. We present a Monte-Carlo analysis using wave-optics simulations in order to determine the relationship of image-plane speckle variance to the first-order system parameters utilized in Fλ/d (focal length, aperture size, wavelength, and detector size). For designers of active imaging systems, this paper provides guidance for selecting a value of Fλ/d that reduces speckle variance in a sampled image while maintaining acceptable resolution.
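The Fλ/d parameter referenced in the abstract follows directly from the first-order system parameters it lists; a minimal sketch, with illustrative (not paper-specific) values:

```python
def f_lambda_over_d(focal_length_m, aperture_m, wavelength_m, pitch_m):
    """F*lambda/d: ratio of optics cutoff scale to detector sampling scale.

    F is the f-number (focal length / aperture diameter), lambda the
    wavelength, and d the detector pitch.  Roughly, Flambda/d < 2 is the
    detector-limited regime and Flambda/d > 2 the optics-limited regime.
    """
    f_number = focal_length_m / aperture_m
    return f_number * wavelength_m / pitch_m

# Illustrative MWIR system: 100 mm focal length, 50 mm aperture,
# 4 um wavelength, 10 um pixel pitch
fld = f_lambda_over_d(0.100, 0.050, 4.0e-6, 10.0e-6)  # -> 0.8
```

The paper's Monte-Carlo speckle analysis then sweeps these same four parameters to map speckle variance against Fλ/d.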
13045-8
Author(s): Mark Martino, Li Zhang, Jeremy W. Mares, Alex Irwin, Oles Fylypiv, Eunmo Kang, CREOL, The College of Optics and Photonics, Univ. of Central Florida (United States); Ronald G. Driggers, Wyant College of Optical Sciences, The Univ. of Arizona (United States); C. Kyle Renshaw, CREOL, The College of Optics and Photonics, Univ. of Central Florida (United States)
23 April 2024 • 4:40 PM - 5:00 PM EDT | National Harbor 3
Computer vision has become crucial to autonomous systems, helping them navigate complex environments. Combining this with geospatial data further provides capability to geolocate the system when GPS is not available or trusted. A test bed was built to characterize the visibility of radio and cellular towers from a ground-vehicle across all atmospheric transmission bands. These targets are exemplary features because of their visibility over long distances and surveyed geolocation. Contrast measurements of targets were characterized and compared in each spectral window under different environmental conditions. Utilizing human perception to build NVIPM models provided predictable range performance for each band.
13045-9
Author(s): Robert E. Short, Leonardo DRS (United States)
23 April 2024 • 5:00 PM - 5:20 PM EDT | National Harbor 3
The use of a synthetic observer model has shown promise for range performance analysis of novel imaging systems. This has many advantages over traditional analytical range models, chiefly stemming from the fact that it determines performance from (real or simulated) imagery directly, rather than from a pre-specified list of parameters. Our synthetic observer approach operates over a Triangle Orientation Discrimination (TOD) target and observer task, using a template correlator for target identification. The synthetic observer performance is taken as a proxy for human target identification performance, enabling expedient evaluation of image processing pipelines, sensor configurations, environmental conditions, etc. In prior work we have explored how the template-correlator-based synthetic observer performs on flat background, flat target imagery. In this work, we apply the same synthetic observer design to natural backgrounds. Performance is compared to that of human observers on the same perception task.
Poster Session
23 April 2024 • 6:00 PM - 7:30 PM EDT | Potomac C
Conference attendees are invited to attend the symposium-wide poster session on Tuesday evening. Come view the SPIE DCS posters, enjoy light refreshments, ask questions, and network with colleagues in your field. Authors of poster papers will be present to answer questions concerning their papers. Attendees are required to wear their conference registration badges to the poster session.

Poster Setup: Tuesday 12:00 PM - 5:30 PM
Poster authors, view poster presentation guidelines and set-up instructions at http://spie.org/DCSPosterGuidelines.
13045-37
Author(s): Jesse Duncan, Troy B. Mayo, Scott Ramsey, Samuel G. Lambrakos, U.S. Naval Research Lab. (United States)
On demand | Presented live 23 April 2024
This study examines topological dependence of diffuse reflectance for IR absorbing materials. A theoretical foundation for this functional dependence is described, elucidating physical processes underlying topological dependence of IR diffuse reflectance of composite-material layers on substrates. The dependence is examined by case studies using composite-material layers that include IR absorbing dyes on fabric substrates. Understanding the topological dependence of diffuse reflectance can assist in determining optimal composite-material configurations for specific reflectance specifications, which can include UV protecting materials.
13045-38
Author(s): Eunmo Kang, Oles Fylypiv, Jeremy W. Mares, CREOL, The College of Optics and Photonics, Univ. of Central Florida (United States); Joshua Follansbee, Ron G. Driggers, Wyant College of Optical Sciences, The Univ. of Arizona (United States); C. Kyle Renshaw, CREOL, The College of Optics and Photonics, Univ. of Central Florida (United States)
On demand | Presented live 23 April 2024
This paper compares passive imaging and active imaging for long-range target tracking in the Near-IR versus SWIR bands. Passive imaging uses direct sunlight as an illumination source. For active imaging, we investigate continuous wave (CW) and laser range-gated (LRG) illumination during both day and night operations. LRG illumination provides temporal controls to reduce atmospheric backscatter and distant background in order to maximize the contrast-to-noise ratio (CNR). This study compares experimental data collected over propagation distances up to 1 km against radiometric models implemented analytically and numerical modeling implemented in the Night Vision Integrated Performance Model (NV-IPM).
13045-39
Author(s): Felipe A. Borcoski, Sergio N. Torres, Joaquin A. Lermanda, Univ. de Concepción (Chile)
On demand | Presented live 23 April 2024
In this work, I will share the main results achieved with a Long Wave Infrared (LWIR) Light Field (LF) imaging system with two novel capabilities relevant to IR image science applications: the capability of digitally refocusing to any nearby object plane with a high Signal to Noise Ratio (SNR), that is, achieving refocused image object planes almost free of Fixed-Pattern Noise (FPN) and blur artifacts; and the capability of achieving multispectral LWIR imaging for the global scene and for all the refocused nearby object planes required, that is, an LWIR radiometric refocusing capacity. The LWIR LF imaging system is implemented with an LWIR microbolometer Xenics camera (8-12 micrometer spectral band) and a high-precision scanning system (Newport). LWIR multispectral capacity is achieved with an array of narrow-band LWIR interference optical filters.
Symposium Plenary on AI/ML + Sustainability
24 April 2024 • 8:30 AM - 10:00 AM EDT | Potomac A
Session Chairs: Latasha Solomon, DEVCOM Army Research Lab. (United States), Ann Marie Raynal, Sandia National Labs. (United States)

Welcome and opening remarks
24 April 2024 • 8:30 AM - 8:40 AM EDT

13165-503
Author(s): David J. Pierce, U.S. Army Intelligence and Security Command (United States)
24 April 2024 • 8:40 AM - 9:20 AM EDT | Potomac A
13165-504
Author(s): Anuradha M. Agarwal, Massachusetts Institute of Technology (United States)
24 April 2024 • 9:20 AM - 10:00 AM EDT | Potomac A
Break
Coffee Break 10:00 AM - 10:30 AM
Session 3: Infrared Imaging Systems III
24 April 2024 • 10:30 AM - 12:40 PM EDT | National Harbor 3
Session Chairs: John W. Devitt, RTX Corp. (United States), Richard L. Espinola, U.S. Naval Research Lab. (United States)
13045-10
Author(s): Katrin Braesicke, Fraunhofer-Institut für Optronik, Systemtechnik und Bildauswertung IOSB (Germany)
24 April 2024 • 10:30 AM - 11:00 AM EDT | National Harbor 3
This paper provides an overview of the development of different models to determine the range performance of infrared imaging systems. It starts with the original motivation for these models: to compare the detection, recognition, and identification ranges of different infrared imaging systems. As these imaging systems developed, further progress in the performance models was needed, and this progress will be described. The rapidly evolving complexity of imaging systems leads to a more diverse approach to the comparison of these new systems. I will supply some examples of how to conquer the new challenges in the development of image enhancement procedures.
13045-11
Author(s): Luke D. Somerville, Wyant College of Optical Sciences, The Univ. of Arizona (United States); Patrick Leslie, Shane Jordan, Wyant College of Optical Sciences (United States); C. Kyle Renshaw, CREOL, The College of Optics and Photonics, Univ. of Central Florida (United States); Ronald G. Driggers, Wyant College of Optical Sciences, The Univ. of Arizona (United States)
24 April 2024 • 11:00 AM - 11:20 AM EDT | National Harbor 3
New detector technology is allowing for detection of extended and even bridged wavebands. These bridged waveband, or superband, detectors have response over large spectral spans, allowing them to take advantage of the unique properties of multiple wavebands. This type of system is especially of interest when the superband contains both the short-wave infrared (SWIR) waveband – where most of the signal comes from reflected light – and mid-wave infrared (MWIR) waveband – where most of the signal comes from emitted light. Such a superband system allows the combination of reflected and emitted light on a single detector, opening new system level optical design trades across many fields and disciplines. Presented is a comparison of reflected and emitted radiometric signal levels for four filtered wavebands using a 1.5 to 5.4 μm superband imaging system: (1) with a 1.9 μm SWIR shortpass filter, (2) with a 2 to 2.5 μm extended SWIR (eSWIR) bandpass filter, (3) with a 3 μm MWIR longpass filter, and (4) with no filter (i.e., full superband response). The comparison in each of the four wavebands is repeated under four solar illumination conditions: full daylight, clouds, dusk, and night.
13045-12
Author(s): Gregor Franz, Fraunhofer-Institut für Optronik, Systemtechnik und Bildauswertung IOSB (Germany); Daniel Wegner, Marten Wiehn, Fraunhofer-Institut für Optronik (Germany); Stefan Keßler, Fraunhofer-Institut für Optronik, Systemtechnik und Bildauswertung IOSB (Germany)
24 April 2024 • 11:20 AM - 11:40 AM EDT | National Harbor 3
For various applications, such as handheld imaging systems and cameras mounted on vehicles or flying platforms, unavoidable motion and vibration of the camera may result in smeared or blurred images or shaky videos. This study aims at evaluating metrics for the measurement of camera vibrations in image sequences, considering triangles and bars as test patterns. The focus is on objective metrics for video stabilization, which are designed to objectively evaluate whether video stabilization was able to eliminate objectionable visual movement. The metrics are evaluated for simulated image sequences captured by an artificially moved camera. The sequences vary in properties such as the sensor noise of the camera and the temporal frequency of the camera vibrations. We analyze the effect of these properties on the metrics' behavior. First results using recorded data from thermal imagers are presented as well. The findings will provide insights into the efficacy of different video stabilization metrics on simulated sequences with varying properties.
13045-13
Author(s): Eddie L. Jacobs, The Univ. of Memphis (United States); C. Kyle Renshaw, Univ. of Central Florida (United States); Ronald G. Driggers, The Univ. of Arizona (United States); Orges Furxhi, The Univ. of Memphis (United States); Joseph Conroy, DEVCOM Army Research Laboratory (United States)
24 April 2024 • 11:40 AM - 12:00 PM EDT | National Harbor 3
Traditional targeting tasks consist of detection, recognition, and identification (DRI). Increasingly, sensing systems are being asked to go beyond these traditional categories for the purpose of distinguishing targets from decoys. The difficulty of this task is dependent both on the sensing system used and the fidelity of the decoy. In this paper we examine how the task of distinguishing target from decoy with imaging sensors fits within the traditional task difficulty description in models such as the Night Vision Integrated Performance Model (NVIPM). We discuss the types of decoys an imaging sensor might encounter. We introduce the idea of interrogation as a task. Using NVIPM and the tracked vehicle target identification task as a baseline, we examine the space of task difficulty for possible insights into the task difficulty of interrogation for imaging sensors. Examining several sensors spanning visible through thermal infrared, we calculate the performance as a function of task difficulty. From this, we discuss the implications and possible limitations of using imaging sensors for interrogation.
13045-14
Author(s): Daniel Wegner, Stefan Keßler, Fraunhofer-Institut für Optronik, Systemtechnik und Bildauswertung IOSB (Germany)
24 April 2024 • 12:00 PM - 12:20 PM EDT | National Harbor 3
Thermal imaging systems with aided or automatic target recognition (ATR) are becoming increasingly important in military systems. For the performance evaluation of such thermal imagers, classifier models for triangle orientation discrimination (TOD) have been proposed. In contrast to the standard TOD sensor test and performance prediction, which include a human observer or a human visual system (HVS) model that mimics the observer, classifier models consider algorithms as the prime consumer of imager data. So far, it is an open question to what extent the TOD task (a classification problem with four classes) is suitable for providing similar evaluations and rankings of thermal imaging devices as algorithms for more complex tasks like object detection, e.g., for ATR. A widely used framework for object detection is "You Only Look Once" (YOLO). In this work, performance assessments using TOD classifier and YOLO-based models are compared in terms of model accuracy. Images from databases as well as synthetic images with triangles and natural backgrounds are degraded according to a unified device description with different levels of blur and imager noise.
13045-15
Author(s): Bradley L. Preece, DEVCOM C5ISR (United States)
24 April 2024 • 12:20 PM - 12:40 PM EDT | National Harbor 3
The Jones detectivity metric, denoted D*, is commonly used to compare thermal camera focal plane arrays. D* projects the thermal noise back into time (1 second) and area (1 cm^2), thereby normalizing its bandwidth. This makes it easier to compare the sensitivity of different thermal detectors. Here we extend the basic idea of this bandwidth normalization to low-light cameras by using a signal-to-noise ratio (SNR), denoted SNR_D*. One SNR_D* goal is to compare the performance of the low-light sensor in the darkest of conditions, and therefore a dark version is defined using the absolute noise floor of the camera. The signal and noise are normalized by projecting them back to the scene (through the optics) into an angular space. It is argued that projecting the SNR back to the scene makes it capable of comparing complete low-light camera systems, including the lens. We also explore the SNR defined and specified for image intensifier tubes, and show why it is not a good predictor of the performance of low-light cameras.
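The bandwidth normalization that the abstract extends starts from the standard Jones formula, D* = sqrt(Ad * Δf) / NEP; a minimal sketch with hypothetical detector values (the abstract's SNR_D* extension itself is not reproduced here):

```python
import math

def jones_dstar(area_cm2, bandwidth_hz, nep_w):
    """Jones detectivity D* = sqrt(Ad * delta_f) / NEP, in cm*Hz^0.5/W.

    Normalizing by detector area and noise bandwidth lets detectors of
    different sizes and readout rates be compared on one scale.
    """
    return math.sqrt(area_cm2 * bandwidth_hz) / nep_w

# Illustrative detector: 15 um square pixel (15e-4 cm on a side),
# 60 Hz noise bandwidth, noise-equivalent power of 1e-12 W
area_cm2 = (15e-4) ** 2
dstar = jones_dstar(area_cm2, 60.0, 1.0e-12)  # ~1.2e10 Jones
```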
Break
Lunch/Exhibition Break 12:40 PM - 2:00 PM
Round-Table Discussion: Current Modeling Limitations
24 April 2024 • 2:00 PM - 3:00 PM EDT | National Harbor 3

Moderator:
Michael Soel, Teledyne FLIR Systems, Inc. (United States)

Panelists:
Ronald Driggers, The Univ. of Arizona (United States)
Orges Furxhi, True Colors Infrared Imaging (United States)
Brian P. Teaney, DEVCOM C5ISR (United States)
Daniel Wegner, Fraunhofer-Institut für Optronik, Systemtechnik und Bildauswertung IOSB (Germany)

Break
Coffee Break 3:00 PM - 3:30 PM
Session 4: Infrared Imaging Systems IV
24 April 2024 • 3:30 PM - 5:30 PM EDT | National Harbor 3
Session Chairs: Richard L. Espinola, U.S. Naval Research Lab. (United States), John W. Devitt, RTX Corp. (United States)
13045-16
Author(s): Jonathan G. Hixson, Brian Teaney, Michael Finch, DEVCOM C5ISR (United States); Georges Nehmetallah, The Catholic University of America (United States); Ronald Driggers, The University of Arizona (United States)
24 April 2024 • 3:30 PM - 3:50 PM EDT | National Harbor 3
This paper will take an initial look at the effect of variations in a sensor's Fλ/d metric value (FLD) on the performance of the Yolo_v3 (You Only Look Once) algorithm for object classification. The Yolo_v3 algorithm will initially be trained using static imagery provided in the commonly available Advanced Driver Assist System (ADAS) dataset. Image processing techniques will then be used to degrade the image quality of the test data set, simulating detector-limited to optics-limited performance of the imagery. The degraded test set will then be used to evaluate the performance of Yolo_v3 for object classification. Results of Yolo_v3 will be presented for the varying levels of image degradation. An initial summary of the results will be discussed, along with recommendations for evaluating an algorithm's performance using a sensor's FLD metric value.
13045-17
Author(s): Thomas P. Watson, Apratim Dasgupta, Daniel Foti, Eddie L. Jacobs, The Univ. of Memphis (United States)
24 April 2024 • 3:50 PM - 4:10 PM EDT | National Harbor 3
Optical turbulence in the atmosphere causes defocus, blur, and wander of images captured over long distances, which can significantly degrade their quality. Turbulence is a manifestation of variations in the index of refraction, which are caused by local variations in air temperature, pressure, humidity, gas content, and other factors. Turbulence can be quantified by the refractive index structure function parameter Cn2. Simulation of images after propagation through an atmosphere of a specific Cn2, along with measurement of the observed Cn2 from images, is thus of interest for a variety of agricultural, environmental, and defense applications. We discuss the generation of simulated imagery after propagation through an atmosphere of a defined Cn2 using various algorithms, then examine methods to determine the observed Cn2 from the generated images. Finally, we choose and test an algorithm to generate images and another to estimate Cn2, then compare and contrast the observed Cn2 to the defined Cn2 in each case to observe how the simulation method and measurement method perform.
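A common single-number link between the Cn2 discussed above and imaging degradation is the Fried parameter r0; a minimal sketch using the standard plane-wave expression for a constant-Cn2 horizontal path (an assumption for illustration, not necessarily the simulation or estimation algorithms the paper evaluates):

```python
import math

def fried_r0(cn2, wavelength_m, path_m):
    """Plane-wave Fried parameter over a constant-Cn2 horizontal path:
    r0 = (0.423 * k^2 * Cn2 * L)^(-3/5), with k = 2*pi/lambda.

    r0 is the aperture diameter beyond which turbulence, not diffraction,
    limits resolution.
    """
    k = 2.0 * math.pi / wavelength_m
    return (0.423 * k ** 2 * cn2 * path_m) ** (-3.0 / 5.0)

# Illustrative case: visible light, moderate daytime turbulence, 1 km path
r0 = fried_r0(cn2=1.0e-14, wavelength_m=0.5e-6, path_m=1000.0)  # ~2 cm
```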
13045-18
Author(s): Eric A. Flug, Kiran Bagalkotkar, Bradley L. Preece, John Graybeal, Trevor Fitzsimmons, DEVCOM C5ISR (United States); Vinh Tran, Simeon Technology LLC (United States)
24 April 2024 • 4:10 PM - 4:30 PM EDT | National Harbor 3
Parallax is an important consideration when using helmet-mounted cameras, as the image offset can impair navigation and object interactions. This study examines various methods to correct for parallax effects using image simulation and a virtual reality headset. Qualitative assessments were made by a group of observers, and results are provided for recommended algorithms and optimal settings.
13045-19
Author(s): Joshua Teague, Orges Furxhi, Joshua Follansbee, Ronald G. Driggers, The Univ. of Arizona (United States); Mark F. Spencer, Joint Directed Energy Transition Office (United States)
24 April 2024 • 4:30 PM - 4:50 PM EDT | National Harbor 3
Continuous Wave (CW) and Laser Range Gated (LRG) are two widely used and effective system design techniques pertaining to active imaging systems for long-range target discrimination and acquisition. Two more recent system design methods are Continuous Wave Time-of-Flight (CWToF) and Laser Range Gated with Range Resolve (LRGRR) active imaging systems. While these two techniques involve a higher degree of complexity in terms of system design, they also aim to provide the user with higher resolution and more sensitive imaging capabilities. In this study, we will quantify the sensitivity and range resolution benefits of these more complex methods in comparison to their more fundamental counterparts (CW and LRG). We will provide a performance model comparing these methods and discuss some environmental and situational circumstances in which any of these approaches would prove to be superior to the others.
13045-20
Author(s): Jonathon Wade, Patrick Leslie, The Univ. of Arizona (United States); Thomas Watson, University of Memphis (United States); Tony Ragucci, Leonardo DRS (United States); Anne Lautzenheiser, PM Apache (United States); Ronald G. Driggers, The Univ. of Arizona (United States)
24 April 2024 • 4:50 PM - 5:10 PM EDT | National Harbor 3
Pilotage gives pilots increased situational awareness and helps them fly without hitting obstructions. Pilotage with rotorcraft tends to be more difficult than with fixed-wing craft: rotorcraft generally fly at lower altitude, which increases the angular velocity, relative to the craft, of the ground below and other nearby features. This increased angular velocity produces increased blur in an imaging system used to pilot the craft, lowering the system's Modulation Transfer Function (MTF) and thus the Target Task Performance (TTP) metric. Data will be collected from a helicopter from different perspectives, using well-characterized cameras (the characterization will also be shown), to give different levels of blurring in the imagery. The imagery will then be used to calculate the amount of blur, in pixels, which in turn will be used to calculate a new degraded MTF and TTP metric.
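The MTF penalty from angular motion during integration can be sketched with the standard linear-smear model, MTF(ξ) = |sinc(aξ)|, where a is the blur extent (angular rate times integration time). A minimal sketch with illustrative numbers, not the paper's measured values:

```python
import math

def motion_mtf(xi, blur_extent):
    """Linear-smear MTF: |sinc(blur_extent * xi)|.
    xi is spatial frequency (cycles per angular unit) and blur_extent is
    angular rate * integration time, in the matching angular unit."""
    x = blur_extent * xi
    if x == 0.0:
        return 1.0
    return abs(math.sin(math.pi * x) / (math.pi * x))

# Illustrative: doubling the angular rate doubles the smear and drops MTF
low = motion_mtf(xi=10.0, blur_extent=0.02)   # 0.2 cycles of smear
high = motion_mtf(xi=10.0, blur_extent=0.04)  # 0.4 cycles of smear
```

This is the mechanism the abstract describes: faster apparent scene motion at low altitude widens the smear, degrading the MTF and, through it, the TTP metric.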
13045-21
Author(s): Timothy Tuininga, MAG Aerospace (United States); David P. Haefner, Stephen D. Burks, DEVCOM C5ISR (United States)
24 April 2024 • 5:10 PM - 5:30 PM EDT | National Harbor 3
A camera's location can be calculated by observing pre-determined fiducials. An easy, open-source option for this is OpenCV's ArUco algorithm. This paper attempts to characterize the performance of OpenCV's ArUco algorithm. It does this by virtually generating ArUco markers while varying the input parameters, such as location, orientation, blur, illumination, and camera attributes. By using this virtually constructed imagery as an input for the algorithm, we can observe the effect of each parameter. This information can affect how we use ArUco markers in real-life applications.
Session 5: Infrared Imaging Systems V
25 April 2024 • 8:10 AM - 10:00 AM EDT | National Harbor 6
Session Chairs: Jonathan G. Hixson, DEVCOM C5ISR (United States), Eddie L. Jacobs, The Univ. of Memphis (United States)
Note room change on Thursday.
13045-22
Author(s): Curtis M. Webb, L3Harris Technologies, Inc. (United States)
25 April 2024 • 8:10 AM - 8:40 AM EDT | National Harbor 6
This author wrote a paper in 2003 for the SPIE conference summarizing the development of the procedures used to test and verify the performance of Forward-Looking Infrared (FLIR) systems. In the 20 years since that paper was presented, FLIR technology has changed significantly, including many of the key premises and rationales for testing and verification, along with the development of new techniques. This paper will review the early development of the test techniques that coincided with historic modeling and field testing. Some of the basic theories that served as the cornerstone for early testing and modeling (the Johnson Criteria, for instance) have been lost and are not familiar to the test engineers of today. Over the years, new testing techniques have been developed, and new FLIR technologies have emerged (e.g. sampled systems, image processing). This paper will review the early days of testing and their relation to modeling and field-test data, and will examine new techniques that are paving the way to an advanced understanding of how the systems of today will work in the field.
13045-23
Author(s): Stephen D. Burks, DEVCOM C5ISR (United States); Jonah P. Sengupta, Army Research Lab (United States); David P. Haefner, David K. Lee, DEVCOM C5ISR (United States)
25 April 2024 • 8:40 AM - 9:00 AM EDT | National Harbor 6
Neuromorphic sensors (also known as event-based cameras) behave differently than traditional imaging sensors: they respond only to changes in stimuli as they occur. They typically have higher dynamic range and frame rates than traditional imaging systems, while using less power, because a pixel only outputs data when a stimulus occurs at that pixel. Neuromorphic sensors have a variety of uses, from temporal anomaly detection to autonomous driving. While the information in the output of a neuromorphic sensor correlates to a change in stimuli, there has been no defined means of characterizing neuromorphic sensors in order to predict performance for a given stimulus. This study focuses on the measurement of the temporal and spatial response of a neuromorphic sensor, with additional discussion of model performance based upon these measurements.
13045-24
Author(s): Jonah P. Sengupta, U.S. Army Research Lab. (United States); Stephen D. Burks, DEVCOM C5ISR (United States)
25 April 2024 • 9:00 AM - 9:20 AM EDT | National Harbor 6
This report explores how various mechanisms affect the response time of event-based cameras (EBCs). EBCs are unconventional electro-optical/IR vision sensors that are sensitive only to changing light. Because their operation is essentially “frameless,” their response time does not depend on a frame rate or readout time, but rather on the number of activated pixels, the magnitude of background light, local fabrication defects, and the analog configuration of the pixel. A test apparatus was devised using a commercial off-the-shelf EBC to extract the sensor latency's dependence on each parameter. Under various illumination levels, results show that mean latency and temporal jitter can increase by a factor of 10 depending on the configured bias parameters. Furthermore, worst-case latency can exceed 1–2 ms even when only 0.005% of the array is activated simultaneously. These and many other findings in the report are intended to inform the use of event-based sensing technology when latency is critical to a successful application.
13045-25
Author(s): Antoine Grégoire, Nathalie Roy, Simon Roy, Simon Potvin, Michel Dupuis, Anne Martel, Jean-Claude Bouchard, Defence Research and Development Canada, Valcartier (Canada)
25 April 2024 • 9:20 AM - 9:40 AM EDT | National Harbor 6
Latency in augmented vision systems can be defined as the total delay imposed on information propagating through a device with respect to a direct path. Latency is critically important in vision systems as it imposes a delay on reaction time. With the emergence of headborne augmented vision systems for dismounted soldiers and the widespread use of embedded digital processing in vision systems, latency becomes most critical in dynamic operational scenarios. As a consequence, latency has been characterized in recent years for various technologies, including AR headsets, VR headsets, and pilot helmets with integrated symbology overlay and night vision. These efforts have led to latency requirements that vary according to the application. However, as there is no standardized definition and testing methodology for latency in vision devices, it is difficult to compare latency values across devices and across manufacturers' stated figures. We propose that latency be characterized as a set of values and not as a single value.
13045-26
Author(s): Patrick Leslie, Joshua Follansbee, The Univ. of Arizona (United States); Thomas P. Watson, The Univ. of Memphis (United States); Shane Jordan, Lindsey Wiley, The Univ. of Arizona (United States); Eddie L. Jacobs, The Univ. of Memphis (United States); Ronald G. Driggers, The Univ. of Arizona (United States)
25 April 2024 • 9:40 AM - 10:00 AM EDT | National Harbor 6
In recent decades, wildfires have become increasingly widespread and hazardous. To protect private property and mitigate the damage, Hot Shot firefighters are deployed into dangerous situations. sUAS (small unmanned aerial system)-based EO/IR systems provide a real-time, high-resolution, targeted response to acquire information critical to the safety and efficacy of wildfire mitigation. With the correct EO/IR sensors on this platform, a system that monitors the locations of firefighters and the fire can give the Hot Shots a safety advantage. The longer-wavelength infrared bands have demonstrated imaging through the smoke of forest fires. However, the longer infrared wavelengths also receive a strong radiometric signal from the temperature of the smoke and fire, and the emissive signatures of the smoke can obscure the sensor's view of the firefighters. The reflected and emitted radiance of humans, terrain, fire, and smoke are studied in the visible (0.4-0.7 µm), SWIR (1.0-1.7 µm), eSWIR (2.0-2.6 µm), and LWIR (8-14 µm) bands. An imaging system with good contrast for imaging people, while not limited by smoke signatures or dynamic range, can be created in the band with the best performance in wildfires.
Break
Coffee Break 10:00 AM - 10:30 AM
Session 6: Infrared Imaging Systems VI
25 April 2024 • 10:30 AM - 12:10 PM EDT | National Harbor 6
Session Chairs: Eddie L. Jacobs, The Univ. of Memphis (United States), Jonathan G. Hixson, DEVCOM C5ISR (United States)
13045-27
Author(s): Arnold L. Adams, IRCameras, LLC (United States)
25 April 2024 • 10:30 AM - 10:50 AM EDT | National Harbor 6
System-level modeling and testing of SWIR, MWIR, and LWIR infrared cameras are presented. Automated camera testing is performed with Santa Barbara Infrared's IRWindows software. Standard tests include NETD, MTF, 3D noise, and ensquared energy. A user-friendly Excel spreadsheet has been developed to model camera system performance; it allows relatively easy addition of new camera types. Each camera type includes information on the ROIC (number and format of pixels, pixel readout rate, pixel size, well size, read noise), sensor material (spectral QE, fill factor, dark current vs. temperature), lens, window and warm-filter spectral transmission, cold-filter temperature and spectral transmission, and cold-aperture configuration (diameter and distance from sensor). Atmospheric temperature, altitude, relative humidity, turbulence, spectral absorption, and spectral emission are included in the Excel model.
13045-28
Author(s): Derek J. Burrell, Allen J. Parker, David P. Haefner, Stephen D. Burks, DEVCOM C5ISR (United States)
25 April 2024 • 10:50 AM - 11:10 AM EDT | National Harbor 6
Direct measurement of F-number presents a known challenge in characterizing electro-optical and infrared imaging systems. Conventional methods typically require the lens to be evaluated separately from the focal plane, indirectly calculating F-number from measurements of effective focal length and entrance-pupil diameter. We demonstrate a novel measurement routine that uses focal-plane retroreflections to determine F-number, along with an unconventional knife-edge technique for gauging the entrance-pupil diameter in situ. Together these two measurements enable us to calculate effective focal length (and in turn pixel pitch by measuring instantaneous field of view) for a comprehensive in situ system description. We further show that a working F-number and effective image distance are attainable through this method for finite-conjugate systems. These tools improve our ability to update existing system models with objective measurements.
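The quantities in the abstract above are tied together by elementary relations: F-number is effective focal length over entrance-pupil diameter, and in the small-angle limit pixel pitch is IFOV times effective focal length. A minimal sketch with illustrative numbers, not the paper's retroreflection-based measurement routine:

```python
def f_number(efl_mm, pupil_mm):
    """F/# = effective focal length / entrance-pupil diameter."""
    return efl_mm / pupil_mm

def pixel_pitch_um(ifov_mrad, efl_mm):
    """Small-angle limit: pitch = IFOV * EFL (mrad * mm -> um)."""
    return ifov_mrad * efl_mm

# Illustrative: a 50 mm lens with a 25 mm entrance pupil, 0.4 mrad IFOV
fno = f_number(50.0, 25.0)          # F/2
pitch = pixel_pitch_um(0.4, 50.0)   # 20 um pitch
```

Measuring any two of these quantities in situ, as the abstract describes, fixes the third, which is what makes the comprehensive system description possible.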
13045-29
Author(s): Nathalie Roy, Simon Potvin, Anne Martel, Defence Research and Development Canada, Valcartier (Canada); Michel Dupuis, Jean-Claude Bouchard, Defence Research and Development Canada (Canada); Antoine Grégoire, Defence Research and Development Canada, Valcartier (Canada)
25 April 2024 • 11:10 AM - 11:30 AM EDT | National Harbor 6
Testing of Night Vision Devices (NVDs) and I2 tubes is regulated by a long series of US Defense standards (often called military standards or MIL standards). These standards set mandatory testing conditions to be fulfilled. Among others, the radiation source used in the tests shall be a tungsten filament lamp operated at a color temperature of 2856 kelvins (K), ±50 K. In recent years, we have noticed that tungsten filament lamps with sufficient spectral-shape accuracy and stability have become harder to procure. In this paper, we present our characterization efforts to determine whether a commercially available LED-based light source is a suitable replacement for a tungsten filament lamp in NVD and I2 tube testing. An LED-based light source is compared to a 2856 K filament lamp in terms of spectral shape, output-power linearity, dynamic range, and relative intensity noise (RIN). We also present the pros and cons of the two sources from the perspective of evaluating NVD performance in a controlled environment emulating different representative night-sky irradiances in support of military and law enforcement operations.
13045-30
Author(s): Allen J. Parker, David P. Haefner, Stephen D. Burks, DEVCOM C5ISR (United States)
25 April 2024 • 11:30 AM - 11:50 AM EDT | National Harbor 6
Modern electro-optical systems increasingly incorporate multiple electro-optical sensors, each adding unique wavebands, alignment considerations, and distortion effects. Laboratory testing of these systems traditionally requires multiple measurement setups to determine metrics. A multi-spectral scene has been created to support many simultaneous, objective measurements from a single mounting position. In some cases, a multi-spectral scene is the only way to test new system-of-systems units, because traditional tests do not engage the built-in algorithms. In 2023, Parker et al. developed a multi-band scene with a diverse target set in order to test camera systems. In this correspondence, we describe a comprehensive and precise calibration of the scene. Among the methods used was a pair of reference cameras translated across the entire field of view. Transformation matrices were determined to map pixel locations to angle; subsequent imaging of the target scene will yield precise locations of each feature, and comparisons between modelled and recorded images based on varied camera positions will validate the success of the calibration.
13045-31
Author(s): Ozgur M. Polat, Yücel C. Özer, ASELSAN A.S. (Turkey)
25 April 2024 • 11:50 AM - 12:10 PM EDT | National Harbor 6
Electro-optical systems are designed for purposes such as detection, recognition, identification, and tracking of objects. To design such systems in an optimum manner, the many processes involved in forming the image of a target point at the system detector output should be carefully examined under various conditions. An image-based system performance prediction tool has been developed for generating synthetic images and videos to be used for estimating and optimizing system performance. This paper introduces this image-based performance prediction tool/scene generator from a system designer's point of view and demonstrates some properties of the tool that may be useful to system analysts and designers for optimization. The synthetic scenes can be generated via parametric models and/or radiometric measurements for the EO system, environment, and object signature. In addition, the tool can be used to generate synthetic data, constructing a large data set for traditional and learning-based algorithms.
Break
Lunch/Exhibition Break 12:10 PM - 1:30 PM
Round-Table Discussion: Future Testing Requirements
25 April 2024 • 1:30 PM - 2:30 PM EDT | National Harbor 6


Moderator:
Curtis M. Webb, L3Harris Technologies, Inc. (United States)

Panelists:
Stephen Burks, DEVCOM C5ISR (United States)
Chris Durell, Remote Sensing Labsphere, Inc. (United States)
Al Gibson, Santa Barbara Infrared, Inc. (United States)
Ilya Koshkin, CI Systems (United States)
Jeffrey T. Meier, US Army Redstone Technical Test Ctr. (United States)
Austin Richards, Oculus Photonics (United States)
Nathalie Roy, Defence Research and Development Canada, Valcartier (Canada)

13045-702
Author(s): Curtis M. Webb, L3Harris Technologies, Inc. (United States); Stephen D. Burks, DEVCOM C5ISR (United States); Christopher N. Durell, Labsphere, Inc. (United States); Al Gibson, Santa Barbara Infrared, Inc. (United States); Ilya Koshkin, CI Systems, Inc. (United States); Jeffrey T. Meier, U.S. Army Redstone Technical Test Ctr. (United States); Austin A. Richards, Oculus Photonics LLP (United States); Nathalie Roy, Defence Research and Development Canada, Valcartier (Canada)
25 April 2024 • 1:30 PM - 2:30 PM EDT | National Harbor 6
Break
Coffee Break 2:30 PM - 3:00 PM
Session 7: Infrared Imaging Systems VII
25 April 2024 • 3:00 PM - 4:40 PM EDT | National Harbor 6
Session Chair: Curtis M. Webb, L3Harris Technologies, Inc. (United States)
13045-32
Author(s): Stephen D. Burks, DEVCOM C5ISR (United States); Linda E. Marchese, RaySecur Inc. (United States); David P. Haefner, Brian P. Teaney, DEVCOM C5ISR (United States); Miles Cathers, RaySecur Inc. (United States)
25 April 2024 • 3:00 PM - 3:20 PM EDT | National Harbor 6
Terahertz (THz) imaging systems use active sources, specialized optics, and detectors in order to penetrate certain materials. Each of these components has design and manufacturing characteristics (e.g. coherence for sources, aberrations for optics, and dynamic range and noise for detectors) that can lead to non-ideal performance of the overall imaging system. Thus, system designers are frequently challenged to design systems that approach theoretical performance, making quantitative measurement of imaging performance a key feedback element of system design. Quantitative evaluation of actual THz system performance will be performed using many of the same figures of merit developed for imaging at other wavelengths (e.g. infrared imaging systems nominally operating in the shorter 3-12 µm wavelength range). The suitability and limitations of these evaluation criteria will be analyzed as part of the process of improving the modeling and design of high-performance THz imaging systems.
13045-33
Author(s): Augusto Cezar Gomes dos Santos, Adenir Silva Filho, Fábio L. Firmino, Leonardo Bruno de Sá, Ctr. Tecnologico do Exercito (Brazil); Cristian A. Delfino, Instituto de Estudos Avançados (Brazil)
25 April 2024 • 3:20 PM - 3:40 PM EDT | National Harbor 6
Characterizing longwave infrared focal plane arrays (LWIR FPAs) is crucial for a wide range of applications in both industrial and research and development (R&D) domains. Previous research has explored reference temperature selection, integration time, and frame rate. Our research focuses on an in-depth analysis of LWIR FPA temperature dynamics. To facilitate this, we developed electronic circuitry for precise temperature control. Within this controlled framework, we thoroughly evaluated fundamental performance metrics, including operability, uniformity, signal transfer function (SiTF), and noise equivalent temperature difference (NETD). Our findings highlight the significant impact of LWIR FPA temperature variations on the statistical behavior of frames. This leads to a substantial increase in malfunctioning pixels, reducing array operability from approximately 99.5% to 39.4%. Additionally, sensor array uniformity shows marked variability, decreasing from 95.9% to 66.2% as temperature rises. Our observations also revealed NETD values ranging from 32 to 41 mK at an aperture of f/1.
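The two figures of merit named in the abstract combine in the standard way, NETD = temporal noise / SiTF. A minimal sketch with illustrative counts (not the paper's measured data):

```python
def netd_mk(temporal_noise_counts, sitf_counts_per_k):
    """NETD in millikelvin from rms temporal noise (counts) and
    SiTF slope (counts per kelvin): NETD = noise / SiTF."""
    return 1000.0 * temporal_noise_counts / sitf_counts_per_k

# Illustrative numbers: 12 counts rms noise against a SiTF of 350 counts/K
netd = netd_mk(temporal_noise_counts=12.0, sitf_counts_per_k=350.0)
# -> a few tens of mK, the same order the abstract reports
```

Because both the noise and the SiTF slope shift with FPA temperature, NETD tracks the temperature dynamics the study characterizes.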
13045-34
Author(s): Jordan L. Rubis, Patrick Leslie, The Univ. of Arizona (United States); Jeffery T. Meier, Ellie Spitzer, U.S. Army Redstone Test Ctr. (United States); Eddie L. Jacobs, The Univ. of Memphis (United States); Ronald G. Driggers, Wyant College of Optical Sciences, The Univ. of Arizona (United States)
25 April 2024 • 3:40 PM - 4:00 PM EDT | National Harbor 6
The Modulation Transfer Function (MTF) of a sensor system is ideally measured in a laboratory setting, where performance is optimal. When the sensor system is used for practical applications in the field, performance is not optimal, and it is important to know how a sensor will perform in the environment in which it will be used. Field MTFs capture the environmental and platform-induced blurs that degrade image quality. Despite the potential sources of blur, there are techniques to obtain a static field MTF comparable to one measured in a laboratory setting.
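One common route to an MTF, in the laboratory or the field, is the edge method: record an edge spread function, differentiate to get the line spread function, and take the normalized FFT magnitude. A minimal 1-D sketch (not the paper's technique; the error-function edge and its width are illustrative assumptions):

```python
import math
import numpy as np

def mtf_from_edge(esf):
    """Estimate MTF from a 1-D edge spread function: differentiate to
    get the line spread function, then take the normalized FFT magnitude."""
    lsf = np.diff(esf)
    lsf = lsf / lsf.sum()
    mtf = np.abs(np.fft.rfft(lsf))
    return mtf / mtf[0]

# Illustrative blurred edge (error-function profile, assumed width)
x = np.linspace(-5.0, 5.0, 256)
esf = np.array([0.5 * (1.0 + math.erf(v / 1.5)) for v in x])
mtf = mtf_from_edge(esf)
# MTF starts at 1 at zero frequency and rolls off as blur removes detail
```

Extra field blur widens the edge, and the same computation then yields the degraded field MTF the abstract discusses.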
13045-35
Author(s): David A. Vaitekunas, W. R. Davis Engineering, Ltd. (Canada); Moses Kodur, Martin Szczesniak, Surface Optics Corp. (United States)
25 April 2024 • 4:00 PM - 4:20 PM EDT | National Harbor 6
This paper will present a comprehensive study on the measurement, modeling, and simulation of the optical properties of wet surface paints. Low-observable paints are designed to camouflage the optical signature of a system by imitating the background thermal signature and scattering incident light (visible and IR). These properties are well studied for pristine conditions, but the optical properties in real conditions (wet and at cold temperatures) are less well known. Herein, we present an in-situ measurement of dry, wet, and icy paint samples commonly used for thermal signature management. The collected data are analyzed for input to ShipIR based on a derived nominal (diffuse) emissivity and specular reflectivity versus incidence angle using the Sanford-Robertson approximation, in which the angular and spectral properties of surface reflectance are separable. Current and modified versions of the ShipIR wetted-surface reflectance model will be compared against the optical properties obtained by the SOC reflectometers.
13045-40
Author(s): Angus Hendrick, Lindsey Wiley, The Univ. of Arizona (United States); Silviu Velicu, EPIR (United States); Rich Pimpinella, Episensors (United States); John Lund, Army Rapid Capabilities and Critical Technologies Office (United States); Ron Driggers, The Univ. of Arizona (United States)
25 April 2024 • 4:20 PM - 4:40 PM EDT | National Harbor 6
In the last half-decade, the extended shortwave infrared (eSWIR) atmospheric band has received attention for its potential to provide better object discrimination than the classical shortwave, midwave, and longwave infrared bands in degraded visual environments such as smoke, dust, and smog. Efforts to characterize the band in these environments have provided interesting results with potential applications in degraded targeting conditions. However, any detection band is only as useful as the best available detector, and thus an investigation into the optimization of detectors for use in the eSWIR band is necessary before any applications are put into practice. Modeling was conducted for both passive and active imaging, wherein the effects of read noise, dark current, pixel size and pitch, quantum efficiency (QE), well depth, and integration time on the system's effective range for a target recognition task were examined. These parameters were then ranked in importance to an eSWIR imaging system's performance.
13045-36
CANCELED: Rifle scope tester: optical system description and application
Author(s): Dario Cabib, CI Systems (Israel) Ltd. (Israel)
25 April 2024 • 4:20 PM - 4:40 PM EDT | National Harbor 6
CI's Rifle Scope Tester is a new optical tool developed to accurately measure important parameters of rifle scopes. Its uniqueness and main advantage is that it is a general riflescope tester: it can be used for day rifle scopes, night-vision riflescopes, and thermal riflescopes, along with clip-ons. The tool tests eyepiece ocular focus and reticle alignment, magnification, parallax distance at maximum magnification, diopter shift at different magnification positions, and the maximum diopter shift of the eyepiece focus adjustment range. In addition, it can be used in production to calibrate the magnification scale and, over time, to measure its stability and accuracy. A Live MTF feature of the instrument's software is used to characterize image quality during production and testing of the system. In this paper we describe the instrument design and how it is operated, and we give examples of actual results in production.
Conference Chair
DEVCOM C5ISR (United States)
Conference Chair
JCD Publishing (United States)
Program Committee
MEPSS LLC (United States)
Program Committee
TNO (Netherlands)
Program Committee
Fraunhofer-Institut für Optronik, Systemtechnik und Bildauswertung IOSB (Germany)
Program Committee
Blake Labs, LLC (United States)
Program Committee
RTX Corp. (United States)
Program Committee
Wyant College of Optical Sciences (United States)
Program Committee
Labsphere, Inc. (United States)
Program Committee
U.S. Naval Research Lab. (United States)
Program Committee
True Colors Infrared Imaging (United States)
Program Committee
DEVCOM C5ISR (United States)
Program Committee
The Univ. of Memphis (United States)
Program Committee
U.S. Dept. of Transportation (United States)
Program Committee
The Aerospace Corp. (United States)
Program Committee
CREOL, The College of Optics and Photonics, Univ. of Central Florida (United States)
Program Committee
DEVCOM C5ISR (United States)
Program Committee
Oculus Photonics LLP (United States)
Program Committee
Teledyne FLIR LLC (United States)
Program Committee
Santa Barbara Infrared, Inc. (United States)
Program Committee
L3Harris Technologies, Inc. (United States)