Proceedings Volume 9839

Degraded Visual Environments: Enhanced, Synthetic, and External Vision Solutions 2016

Volume Details

Date Published: 7 July 2016
Contents: 9 Sessions, 23 Papers, 0 Presentations
Conference: SPIE Defense + Security 2016
Volume Number: 9839

Table of Contents

  • Front Matter: Volume 9839
  • System Performance Evaluation
  • Symbology and Synthetic Vision
  • Advanced Vision Systems for Commercial Flight
  • Human Performance Evaluation
  • Image Processing and Display
  • Advanced DVE Sensing
  • Display Advances, Applications, and Future Design
  • Head-up and Body-worn Displays
Front Matter: Volume 9839
Front Matter: Volume 9839
This PDF file contains the front matter associated with SPIE Proceedings Volume 9839 including the Title Page, Copyright information, Table of Contents, Introduction, and Conference Committee listing.
System Performance Evaluation
Flight test results of helicopter approaches with trajectory guidance based on in-flight acquired LIDAR data
This paper presents flight test results for approach trajectories calculated online aboard DLR’s manned research helicopter. This highly modified EC135 is equipped with a commercial forward-looking Light Detection and Ranging (LIDAR) sensor with a range of 1 km. During an approach to an unmapped landing site, geo-referenced LIDAR samples are acquired and combined with a priori information. The resulting representation of the environment is used to generate trajectories that are collision-free, technically feasible, and acceptable to pilots. As new samples are collected by the LIDAR sensor, the environment map is updated in real time and, if necessary, the trajectory is adjusted in accordance with typical approach procedures. For experimental reasons, trajectory following was performed manually: the pilot was provided with a “Tunnel-In-The-Sky” head-down display including visual cues for spatial and speed guidance during the approach.
Toward autonomous rotorcraft flight in degraded visual environments: experiments and lessons learned
Adam Stambler, Spencer Spiker, Marcel Bergerman, et al.
Unmanned cargo delivery to combat outposts will inevitably involve operations in degraded visual environments (DVE). When DVE occurs, the aircraft autonomy system needs to be able to function regardless of the obscurant level. In 2014, Near Earth Autonomy established a baseline perception system for autonomous rotorcraft operating in clear air conditions, when its m3 sensor suite and perception software enabled autonomous, no-hover landings onto unprepared sites populated with obstacles. The m3’s long-range lidar scanned the helicopter’s path and the perception software detected obstacles and found safe locations for the helicopter to land. This paper presents the results of initial tests with the Near Earth perception system in a variety of DVE conditions and analyzes them from the perspective of mission performance and risk. Tests were conducted with the m3’s lidar and a lightweight synthetic aperture radar in rain, smoke, snow, and controlled brownout experiments. These experiments showed the capability to penetrate mild DVE, but perceptual capability degraded in the densest brownouts. The results highlight the need not only for an improved ability to see through DVE, but also for improved algorithms to monitor and report DVE conditions.
Capability comparison of pilot assistance systems based solely on terrain databases versus sensor DB fused data systems
Thomas Münsterer, Jürgen Scheuch, Philipp Völschow, et al.
This paper compares the capabilities achievable with a purely database-driven DVE system (NIAG class 4 system) with those of a database system fused with sensor information (NIAG class 2 system). Both systems present the same 3D conformal symbology. To achieve terrain-conformal representation with operational navigation systems and databases, specific compensation techniques are required; this applies especially to the pure database system. The sensor-database fusion system, on the other hand, relies mainly on relative accuracies, which simplifies the required compensation techniques at the cost of additional sensors and fusion algorithms. Both system configurations were flight-tested on a test helicopter. The test results, specifics, and basic limitations are discussed and compared.
Performance evaluation of active sub-Terahertz systems in Degraded Visual Environments (DVE)
Romain Ceolato, Bernard Tanguy, Christian Martin, et al.
This paper addresses the problem of critical operations in Degraded Visual Environments (DVE). DVE usually refers to conditions in which a pilot's perception is degraded by environmental factors, including obscurants from bad weather (e.g. fog, rain, snow) or accidental events (e.g. brownout, whiteout, smoke). Critical operations in DVE are a growing field of research, as DVE is a cause of numerous fatal accidents for operational forces. Due to the lack of efficient sources and sensors in the Terahertz (THz) region, this domain has remained a largely unexplored part of the electromagnetic spectrum. Recently, the use of sub-Terahertz waves has been proposed to see through dense clouds of obscurants (e.g. sand, smoke) in DVE conditions. To evaluate their performance, several sub-Terahertz systems (e.g. bolometer-array cameras, liquid-helium-cooled bolometers) were operated under artificial, controlled DVE conditions at ONERA facilities. The purpose of this paper is to report the results of these field experiments: attenuation measurements from 400 GHz to 700 GHz, together with a performance evaluation of the different sub-Terahertz systems, are presented.
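As a reading aid for attenuation figures of this kind, the following minimal Python sketch shows how path-integrated attenuation in dB, and specific attenuation per metre, can be derived from a received power and a clear-air reference power. The function names, power values, and the 10 m path length are illustrative assumptions, not details from the paper.

```python
import numpy as np

def attenuation_db(p_received_w, p_clear_air_w):
    """Path-integrated attenuation in dB relative to a clear-air reference."""
    return 10.0 * np.log10(p_clear_air_w / p_received_w)

def specific_attenuation_db_per_m(p_received_w, p_clear_air_w, path_length_m):
    """Attenuation normalized by the length of the obscurant-filled path."""
    return attenuation_db(p_received_w, p_clear_air_w) / path_length_m

# Hypothetical measurement: 0.5 mW received through the obscurant cloud versus
# 2.0 mW in clear air over a 10 m path -> ~6 dB total, ~0.6 dB/m.
print(round(specific_attenuation_db_per_m(0.5e-3, 2.0e-3, 10.0), 2))
```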
Symbology and Synthetic Vision
The glass dome: low-occlusion obstacle symbols for conformal displays
Niklas Peinecke, Alvaro Chignola, Daniela Schmid, et al.
Contemporary helmet-mounted displays integrate high-resolution display units together with precise head-tracking solutions. This combination offers the opportunity to show symbols in a conformal way. Conformality here means that a hazard symbol is linked to the outside scenery. Thus, a pilot intuitively understands the connection between the symbol and its corresponding terrain feature, even if the feature is not fully visible due to degraded visual conditions. To serve this purpose, the symbol has to be sufficiently noticeable in terms of size and brightness. However, this gives rise to the danger that parts of the outside scenery are occluded by the symbol. Furthermore, symbols should not clutter the display, in order not to distract the pilot. We present a framework for highlighting obstacles with symbols that balance low occlusion against noticeability. Our concept allows different representations for individual classes of obstacles to be included in a unified way. We detail the implementation of the display symbols. Finally, we present results of a first acceptance test with pilots.
Amplifying the helicopter drift in a conformal HMD
Sven Schmerwitz, Patrizia M. Knabl, Thomas Lueken, et al.
Helicopter operations require well-controlled, minimal lateral drift shortly before ground contact. Any lateral speed exceeding this small threshold can cause a dangerous moment around the roll axis, which may lead to a total rollover of the helicopter. As long as pilots can observe visual cues from the ground, they can easily control the helicopter drift. But whenever natural vision is reduced or obscured, e.g. due to night, fog, or dust, this controllability diminishes. Helicopter operators could therefore benefit from some type of “drift indication” that mitigates the influence of a degraded visual environment. Humans generally derive ego motion from the perceived flow of environmental objects. Because the perceived visual cues are located close to the helicopter, even small movements can be recognized. This fact was used to investigate a modified drift indication: to enhance the perception of ego motion in a conformal HMD symbol set, the measured movement was used to generate a pattern motion in the forward field of view, close to or on the landing pad. The paper discusses this method of amplified ego-motion drift indication, addressing impact factors such as visualization type, location, and gain. Conclusions from previous studies, a high-fidelity experiment and a part-task experiment, are also provided. A part-task study is presented that compared different amplified drift indications against a predictor. 24 participants, 15 of them holding a fixed-wing license and 4 of them helicopter pilots, had to perform a dual task on a virtual reality headset. A simplified control model was used to steer a “helicopter” down to a landing pad while acknowledging randomly placed characters.
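To make the amplification idea concrete, the sketch below (Python, purely illustrative) advances a ground-referenced display pattern by the measured lateral velocity scaled with a gain, so that small drift rates remain perceptible. The function name, the gain of 3, and the update rate are hypothetical choices, not values from the study.

```python
def amplified_pattern_offset(prev_offset_m, lateral_velocity_mps, dt_s, gain=3.0):
    """Advance the displayed pattern by the measured drift scaled by an
    amplification gain (gain > 1 exaggerates the perceived ego motion)."""
    return prev_offset_m + gain * lateral_velocity_mps * dt_s

# Hypothetical case: a constant 0.3 m/s drift sampled at 60 Hz for one second
# moves the pattern by ~0.9 m, i.e. three times the actual 0.3 m of drift.
offset = 0.0
for _ in range(60):
    offset = amplified_pattern_offset(offset, 0.3, 1.0 / 60.0, gain=3.0)
print(round(offset, 2))
```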
A concept for a virtual flight deck shown on an HMD
A combination of see-through head-worn or helmet-mounted displays (HMDs) and imaging sensors is frequently used to overcome the limitations of the human visual system in degraded visual environments (DVE). A visual-conformal symbology displayed on the HMD allows pilots to see objects, such as the landing site or obstacles, that would otherwise be invisible. These HMDs are worn by pilots sitting in a conventional cockpit, which provides a direct view of the external scene through the cockpit windows and a user interface with head-down displays and buttons. In a previous publication, we presented the advantages of replacing the conventional head-down display hardware with virtual instruments. These virtual aircraft-fixed cockpit instruments were displayed on the Elbit JEDEYE system, a binocular see-through HMD. The idea of our current work is not only to virtualize the display hardware of the flight deck, but also to replace the direct view of the out-the-window scene with a virtual view of the surroundings. This imagery is derived from various sensors and rendered on an HMD without see-through capability. This approach promises many advantages over conventional cockpit designs. Besides potential weight savings, such a future flight deck can provide a less restricted outside view, as the pilots are able to virtually see through the airframe. The paper presents a concept for the realization of such a virtual flight deck and states the expected benefits as well as the challenges to be met.
Helmet mounted display supporting helicopter missions during en route flight and landing
Degraded visual environments remain a major problem for helicopter pilots, especially during approach and landing. Particularly in the landing phase, the pilot's eyes must be directed outward in order to find visual cues as indicators for drift estimation. If lateral speed exceeds its limits, it can damage the airframe or, in extreme cases, lead to a rollover. Since poor visibility can contribute to a loss of situation awareness and spatial disorientation, it is crucial to intuitively provide the pilot with the essential visual information needed for a safe landing. With continuing technology advancement, helmet-mounted displays (HMDs) will soon become widespread, because their look-through capability makes it possible to monitor the outside view while flight-phase-dependent symbology is presented on the helmet display. Besides primary flight information, additional information for obstacle accentuation or terrain visualization can be displayed on the visor. Virtual conformal elements such as a 3D pathway depiction or a 3D landing zone representation can help the pilot to maintain control until touchdown, even in poor visual conditions. This paper describes first investigations of both en-route and landing symbology presented on a helmet-mounted display system during helicopter flight trials with DLR's flying helicopter simulator ACT/FHS.
Advanced Vision Systems for Commercial Flight
Present and future of vision systems technologies in commercial flight operations
Jim Ward
The development of systems that enable pilots of all types of aircraft to see through fog, clouds, and sandstorms and land in low visibility has been widely discussed and researched across aviation. For military applications, the goal has been to operate in a Degraded Visual Environment (DVE), using sensors to enable flight crews to see and operate without concern for weather that limits human visibility. These military DVE goals are mainly oriented toward the off-field landing environment. For commercial aviation, the Federal Aviation Administration (FAA) implemented operational regulations in 2004 that allow the flight crew to see the runway environment using an Enhanced Flight Vision System (EFVS) and continue the approach below the normal landing decision height. The FAA is expanding the current use and economic benefit of EFVS technology and will soon permit landing without any natural vision using real-time weather-penetrating sensors. The operational goals of both of these efforts, DVE and EFVS, have been the stimulus for the development of new sensors and vision displays to create the modern flight deck.
Human Performance Evaluation
Assessing impact of dual sensor enhanced flight vision systems on departure performance
Lynda J. Kramer, Timothy J. Etherington, Kurt Severance, et al.
Synthetic Vision (SV) and Enhanced Flight Vision Systems (EFVS) may serve as game-changing technologies to meet the challenges of the Next Generation Air Transportation System and the envisioned Equivalent Visual Operations (EVO) concept, that is, the ability to achieve the safety and operational tempos of current-day Visual Flight Rules operations irrespective of the weather and visibility conditions. One significant obstacle lies in the definition of the required equipage on the aircraft and at the airport to enable the EVO concept objective. A motion-base simulator experiment was conducted to evaluate the operational feasibility and pilot workload of conducting departures and approaches on runways without centerline lighting, in visibility as low as 300 feet runway visual range (RVR), by use of onboard vision system technologies on a Head-Up Display (HUD) without need of or reliance on natural vision. Twelve crews evaluated two methods of combining dual-sensor (millimeter-wave radar and forward-looking infrared) EFVS imagery on pilot-flying and pilot-monitoring HUDs. In addition, the impact of adding SV to the dual-sensor EFVS imagery on crew flight performance and workload was assessed. Using EFVS concepts during 300 RVR terminal operations on runways without centerline lighting appears feasible, as all EFVS concepts yielded departure and landing rollout performance equivalent to, or better than, that of operations flown with a conventional HUD to runways having centerline lighting, without any workload penalty. Adding SV imagery to the EFVS concepts provided situation awareness improvements but no discernible improvements in flight path maintenance.
Measuring the denoising performance of the human visual system for optimum display quality
The human visual system (HVS) is a complicated network of filters and algorithms evolved to provide humans with an optimal set of inputs for the task at hand. Temporal and spatial averaging, matched filter analysis, variable gain settings, real time adjustments and feedback – all of these are seamlessly available to humans as they view the world around them via the HVS. In certain situations, however, these abilities may be limited by circumstances necessitated by the task, such as an intermediate display from an external sensor, constrained viewing distance or gain settings, etc. In order to improve the performance of individuals in these situations, a more thorough understanding of how the HVS compensates and performs is required. This paper investigates the denoising performance of the HVS in the presence of noise and various display settings to establish a baseline for optimal display adjustment quality under environmental or system constraints.
Perceptual issues for color helmet-mounted displays: luminance and color contrast requirements
Thomas H. Harding, Clarence E. Rash, Morris R. Lattimore, et al.
Color is one of the latest design characteristics of helmet-mounted displays (HMDs). Its inclusion in design specifications is based on two suppositions: 1) color provides an additional method of encoding information, and 2) color provides a more realistic, and hence more intuitive, presentation of information, especially pilotage imagery. To some degree, these two perceived advantages have been validated with head-down panel-mounted displays, although not without a few problems associated with visual physiology and perception. These problems become more prevalent when the user population expands beyond military aviators to a general user population, of which a significant portion may have color vision deficiencies. When color is implemented in HMDs, which are eyes-out, see-through displays, visual perception issues become an increased concern. A major confound with HMDs is their inherent see-through (transparent) property. The result is that color in the displayed image combines with color from the outside (or in-cockpit) world, possibly producing a false perception of either or both images. While human-factors-derived guidelines based on trial and error have been developed, color HMD systems still place more emphasis on colorimetric than on perceptual standards. This paper identifies the luminance and color contrast requirements for see-through HMDs. Also included is a discussion of ambient scene metrics and the choice of symbology color.
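For readers unfamiliar with how see-through luminance contrast is usually quantified, the following minimal Python sketch evaluates one commonly used formulation, where the transmitted outside-scene luminance acts as the background for the displayed symbol. The specific luminance and transmittance values are illustrative assumptions, not requirements from the paper.

```python
def see_through_contrast_ratio(symbol_luminance_cd_m2,
                               ambient_luminance_cd_m2,
                               see_through_transmittance):
    """Contrast ratio of an HMD symbol seen against the outside scene:
    (symbol luminance + transmitted background) / transmitted background."""
    background = see_through_transmittance * ambient_luminance_cd_m2
    return (symbol_luminance_cd_m2 + background) / background

# Hypothetical daylight case: a 3000 cd/m^2 symbol over a 10000 cd/m^2 scene
# viewed through a 70% transmissive visor gives a contrast ratio of ~1.43.
print(round(see_through_contrast_ratio(3000.0, 10000.0, 0.7), 2))
```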
Image Processing and Display
Wavelet based image visibility enhancement of IR images
Enhancing the visibility of infrared images obtained in a degraded visibility environment is very important for many applications, such as surveillance, visual navigation in bad weather, and helicopter landing in brownout conditions. In this paper, we present an IR image visibility enhancement system based on adaptively modifying the wavelet coefficients of the images. In our proposed system, input images are first filtered by a histogram-based dynamic range filter to remove sensor noise and convert them to an 8-bit dynamic range for efficient processing and display. By utilizing a wavelet transformation, we modify the image intensity distribution and enhance image edges simultaneously. In the wavelet domain, the low-frequency wavelet coefficients contain the original image intensity distribution, while the high-frequency wavelet coefficients contain the edge information of the original images. To modify the image intensity distribution, an adaptive histogram equalization technique is applied to the low-frequency wavelet coefficients; to enhance image edges, an adaptive edge enhancement technique is applied to the high-frequency wavelet coefficients. An inverse wavelet transformation is then applied to the modified wavelet coefficients to obtain intensity images with enhanced visibility. Finally, a Gaussian filter is used to remove blocking artifacts introduced by the adaptive techniques. Since the wavelet transformation uses down-sampling to obtain the low-frequency wavelet coefficients, histogram equalization of these coefficients is computationally more efficient than histogram equalization of the original images. We tested the proposed system with degraded IR images obtained from a helicopter landing in brownout conditions. Our experimental results show that the proposed system is effective in enhancing the visibility of degraded IR images.
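The pipeline described in the abstract (dynamic-range reduction, wavelet decomposition, adaptive equalization of the approximation band, edge amplification of the detail bands, inverse transform, and artifact smoothing) can be sketched as follows. This is a minimal illustration using PyWavelets and OpenCV, with CLAHE standing in as the adaptive equalizer and a plain gain on the detail coefficients; the wavelet choice, gain, and all parameters are assumptions, not the authors' implementation.

```python
import cv2
import numpy as np
import pywt

def enhance_ir(frame_8bit, wavelet="haar", detail_gain=1.5):
    # Single-level 2D wavelet decomposition: approximation (low-frequency)
    # band plus horizontal/vertical/diagonal detail (high-frequency) bands.
    approx, (h, v, d) = pywt.dwt2(frame_8bit.astype(np.float32), wavelet)

    # Adaptive histogram equalization (CLAHE) of the low-frequency band.
    # The band is quarter-size, so this is cheaper than equalizing the
    # full-resolution frame.
    lo, hi = float(approx.min()), float(approx.max())
    approx_u8 = cv2.normalize(approx, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    approx_eq = clahe.apply(approx_u8).astype(np.float32) / 255.0 * (hi - lo) + lo

    # Simple stand-in for adaptive edge enhancement: amplify the detail bands.
    details = (h * detail_gain, v * detail_gain, d * detail_gain)

    # Inverse transform, then a light Gaussian blur to suppress blocking
    # artifacts introduced by the tile-based equalization.
    enhanced = pywt.idwt2((approx_eq, details), wavelet)
    enhanced = cv2.GaussianBlur(enhanced, (3, 3), 0)
    return np.clip(enhanced, 0, 255).astype(np.uint8)
```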
A new optical system for low-profile HUD by using a prism waveguide
Masato Tanaka, Yoshiki Arita
A new approach to a waveguide-based optical system is presented. Recently, Large Area Displays (LADs) have increasingly been installed in airplane cockpits. Following this trend, Low Profile Head Up Displays (LPHUDs) using holographic waveguides are being developed, which can be installed in cockpits without interfering with LADs. However, the holographic waveguide has difficulty realizing a color display because of its inherent characteristics. We have succeeded in developing a new optical system for an LPHUD that has a wide field of view, a large head motion box, and excellent color performance, by using a prism waveguide.
Global vision systems regulatory and standard setting activities
Carlo Tiana, Thomas Münsterer
A number of committees globally, and the regulatory agencies they support, are actively delivering and updating performance standards for vision systems (Enhanced, Synthetic, and Combined) as they apply to both fixed-wing and, more recently, rotorcraft operations in low visibility. We provide an overview of each committee's present and past work, as well as an update of recent activities and future goals.
Advanced DVE Sensing
Characterization of the OPAL LiDAR under controlled obscurant conditions
Xiaoying Cao, Philip Church, Justin Matheson
Neptec Technologies’ OPAL-120 3D LiDAR is optimized for obscurant penetration. The OPAL-120 uses a scanning mechanism based on a Risley prism pair. The scan patterns are created by rotating two prisms under independent motor control. The geometry and material properties of the prisms define the conical field of view of the sensor, which can be built to between 60 and 120 degrees. The OPAL-120 was recently evaluated in a controlled obscurant chamber capable of generating clouds of obscurants over a depth of 22 m. Obscurants used in this investigation included Arizona road dust, water fog, and fog oil. The obscurant cloud optical densities were monitored with a transmissometer. Optical depth values started at an upper value of 6 and progressively decreased to 0. Targets were positioned at the back of the obscurant chamber, at a distance of 60 m from the LiDAR. The targets are made of a foreground array of equally spaced painted wood stripes in front of a solid background. Reflectivity contrasts were achieved with foreground/background combinations of white/white, white/black, and black/white. Data analysis is presented on the effect of optical density on range and cross-range resolution and on accuracy. The analysis covers all combinations of obscurant types and target reflectivity contrasts.
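As background on the Risley scanning geometry, the following minimal Python sketch uses the standard first-order (thin-prism) approximation in which each prism contributes a fixed angular deflection and the two deflections add as rotating vectors, tracing a rosette whose full cone angle is bounded by twice the sum of the deflections. The wedge deflections and rotation rates below are illustrative values, not OPAL-120 specifications.

```python
import numpy as np

def risley_scan_pattern(delta1_deg, delta2_deg, rate1_hz, rate2_hz,
                        duration_s=1.0, samples=20000):
    """First-order two-prism Risley scan: azimuth/elevation deflection angles
    (degrees) traced over time as the prisms rotate independently."""
    t = np.linspace(0.0, duration_s, samples)
    a1 = 2.0 * np.pi * rate1_hz * t
    a2 = 2.0 * np.pi * rate2_hz * t
    az = delta1_deg * np.cos(a1) + delta2_deg * np.cos(a2)
    el = delta1_deg * np.sin(a1) + delta2_deg * np.sin(a2)
    return az, el

# Illustrative: two 30-degree deflections bound a 120-degree cone; slightly
# mismatched counter-rotating rates sweep a dense rosette across that cone.
az, el = risley_scan_pattern(30.0, 30.0, rate1_hz=25.0, rate2_hz=-23.0)
print(round(float(np.max(np.hypot(az, el))), 1))  # ~60.0 deg half-angle
```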
Three-dimensional landing zone ladar
James Savage, Shawn Goodrich, H. N. Burns
Three-Dimensional Landing Zone (3D-LZ) refers to a series of Air Force Research Laboratory (AFRL) programs to develop high-resolution imaging ladar to address helicopter approach and landing in degraded visual environments, with emphasis on brownout; cable warning and obstacle avoidance; and controlled flight into terrain. Initial efforts adapted ladar systems built for munition seekers, and their success led to the 3D-LZ Joint Capability Technology Demonstration (JCTD), a 27-month program to develop and demonstrate a ladar subsystem that could be housed with the AN/AAQ-29 FLIR turret flown on US Air Force Combat Search and Rescue (CSAR) HH-60G Pave Hawk helicopters. Following the JCTD flight demonstration, further development focused on reducing size, weight, and power while continuing to refine the real-time geo-referencing, dust rejection, obstacle and cable avoidance, and Helicopter Terrain Awareness and Warning System (HTAWS) capability demonstrated under the JCTD. This paper summarizes significant ladar technology development milestones to date, the individual ladar technologies within 3D-LZ, and the results of flight testing.
Mapping of ice, snow and water using aircraft-mounted LiDAR
Philip Church, Justin Matheson, Brett Owens
Neptec Technologies Corp. has developed a family of obscurant-penetrating 3D laser scanners (OPAL 2.0) that are being adapted to airborne platforms for operations in Degraded Visual Environments (DVE). The OPAL uses a scanning mechanism based on a Risley prism pair. Data acquisition rates can reach 200 kHz for ranges within 240 m and 25 kHz for ranges exceeding 240 m. The scan patterns are created by rotating two prisms under independent motor control, producing a conical field of view (FOV). An OPAL laser scanner with a 90° FOV was installed on a Navajo aircraft, looking down through an aperture in the aircraft floor. The rotation speeds of the Risley prisms were selected to optimize the uniformity of the data sample distribution on the ground. Flight patterns simulating a landing approach over snow and ice in an unprepared Arctic environment were also flown to evaluate the capability of the OPAL LiDAR to map the snow and ice elevation distribution in real time and to highlight potential obstacles. Data were also collected to evaluate the detection of wires when flying over water, snow, and ice. The main results and conclusions obtained from the flight data analysis are presented.
Display of real-time 3D sensor data in a DVE system
Philipp Völschow, Thomas Münsterer, Michael Strobel, et al.
This paper describes the implementation of the display of real-time processed LiDAR 3D data in a DVE pilot assistance system. The goal is to present the pilot with a comprehensive image of the surrounding world without misleading or cluttering information. 3D data that can be attributed, i.e. classified, as terrain or as one of the predefined obstacle classes are depicted differently from data belonging to elevated objects that could not be classified. Display techniques may differ between head-down and head-up displays to avoid cluttering the outside view in the latter case. While terrain is shown as shaded surfaces with grid structures or as grid structures alone, respectively, classified obstacles are typically displayed with obstacle symbols only. Data from objects elevated above ground are displayed as shaded 3D points in space. In addition, the displayed 3D points are accumulated over a certain time frame, which on the one hand allows a cohesive structure to be displayed and on the other hand allows moving objects to be displayed correctly. Color coding or texturing can also be applied based on known terrain features such as land use.
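A minimal Python sketch of the time-windowed accumulation of classified points described above is given below. The class labels, window length, and data layout are illustrative assumptions rather than details of the actual system.

```python
from collections import deque
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class PointBatch:
    timestamp_s: float
    points_xyz: List[Tuple[float, float, float]]
    label: str  # e.g. "terrain", "obstacle", or "unclassified"

class DisplayAccumulator:
    """Keeps only the point batches that fall inside a sliding time window,
    so moving objects are rendered at their latest positions instead of
    smearing across the display."""

    def __init__(self, window_s: float = 2.0):
        self.window_s = window_s
        self.batches = deque()  # PointBatch objects inside the window

    def add(self, batch: PointBatch) -> None:
        self.batches.append(batch)
        while self.batches and batch.timestamp_s - self.batches[0].timestamp_s > self.window_s:
            self.batches.popleft()

    def renderable(self, label: str) -> List[Tuple[float, float, float]]:
        # Collect all points of one class for its dedicated display style
        # (shaded surface, obstacle symbol, or shaded 3D points).
        return [p for b in self.batches if b.label == label for p in b.points_xyz]
```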
Display Advances, Applications, and Future Design
Cooling of organic light-emitting diode display panels with heat pipes
Anita Sure, Gowtham Kumar Vankayala, Vaibhav Baranwal, et al.
The half-life of an organic light-emitting diode (OLED) is a function of temperature and decreases as the operating temperature increases. Thermal management is therefore important for the efficient operation of OLED-based displays. High-luminance applications such as aerospace cockpits require high power densities, which lead to increased operating temperatures. Passive cooling is the preferred choice in aerospace applications. In this work, a passive cooling option using heat pipes is studied and implemented to reduce the rise in display temperature.
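To give a sense of the scale involved, the sketch below (Python) evaluates the textbook steady-state relation in which the temperature rise equals the dissipated power times the total series thermal resistance of the cooling path. The resistance and power values are hypothetical illustrations, not figures from the paper.

```python
def panel_temperature_rise_k(power_w, r_spreader_k_per_w,
                             r_heat_pipe_k_per_w, r_sink_k_per_w):
    """Steady-state temperature rise of the panel above ambient for a simple
    series path: panel -> spreader -> heat pipe -> heat sink -> ambient."""
    return power_w * (r_spreader_k_per_w + r_heat_pipe_k_per_w + r_sink_k_per_w)

# Hypothetical numbers: 20 W dissipated through 0.5 + 0.1 + 0.8 K/W of series
# resistance gives a 28 K rise above ambient.
print(panel_temperature_rise_k(20.0, 0.5, 0.1, 0.8))
```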
Head-up and Body-worn Displays
Combatant eye protection: an introduction to the blue light hazard
Morris R. Lattimore
Emerging evidence of metabolic vulnerability to visible blue light is vitally important, as it is indicative of a scalable threshold effect. Added stressors (e.g., increased altitude or contact lens wear) could shift the wavelength effects toward a more damaging clinical picture. Recent reports have indicated rod photo-pigment damage resulting from solar blue-light exposures, adversely affecting unaided night vision, a militarily important performance decrement. The activation wavelength for the daily synchronous setting of the Circadian Clock, which regulates the synchronization of all hormonal and organ systems throughout the body, falls within this blue light perceptual range.
Head-up, eyes-out in day and at night: Striker HMD, evolution or revolution?
Alex Cameron, Ross Hobson
BAE Systems' two-part Striker helmet product family is a mature, fully integrated helmet-mounted display in volume production and in use on multiple fixed-wing and rotary-wing platforms worldwide. The Striker system on Typhoon and Gripen provides a high-accuracy off-boresight weapon cueing capability to fast-jet users in eight nations. The advanced all-digital rotary-wing variant of the Striker HMD is a dedicated helicopter version that has already been extensively flight tested on U.S., UK, and European platforms. This visor-projected HMD is a mature, advanced integrated display helmet that has been specifically designed for high-capability military HMD applications but also has wider dual-use potential in specialist non-military applications. The underlying two-part integrated helmet concept is also a mature design already in widespread operational use. The combination of these features has enabled the core Striker HMD design concept to be enhanced to meet the demands of both 5th-generation fixed-wing platforms and enhanced capabilities for future military and civil rotary-wing applications.