
Proceedings Paper

Sensor data/cueing continuum for rotorcraft degraded visual environment operations
Author(s): Joe Minor; Zachariah Morford; Walter Harrington

Paper Abstract

Degraded Visual Environments (DVE) can significantly restrict rotorcraft operations during their most common mission profiles: terrain flight and off-airfield operations. The user community has been seeking solutions that allow pilotage in DVE and mitigate the additional risks and limitations imposed by the degraded visual scene. Achieving such a solution requires a common understanding of the DVE problem, the history of solutions to this point, and the full range of approaches that may enable future rotorcraft pilotage in DVE. Three major technologies contribute to rotorcraft operations in DVE: flight control, cueing, and sensing; all three must be addressed for an optimal solution. Increasing aircraft stability through flight control improvements reduces pilot workload and facilitates operations in both Degraded Visual Environments and Good Visual Environments (GVE), and therefore must be a major piece of any DVE solution. Sensing and cueing improvements are required to gain a level of situational awareness that permits low-level flight and off-airfield landings while avoiding contact with terrain or obstacles that are not visually detectable by the flight crew. How this sensor information is presented to the pilot is a subject of debate among those working to solve the DVE problem. There are two major philosophies in the field of DVE sensor and cueing implementation. The first is that the sensor should display an image that allows the pilot to perform all pilotage tasks as they would under visual flight rules (VFR). The second is that the pilot should follow an algorithm-derived, sensor-cleared, precision flight path, presented as cues for the pilot to fly as they would under instrument flight rules (IFR).
There are also combinations of these two methods that offer differing levels of assistance to the pilot, ranging from aircraft flight symbology overlaid on the sensor image, to symbols that augment the displayed image and help the pilot interpret the scene, to a complete virtual reality that presents a display of the sensed world without any “see-through” capability. These options can use two primary means of transmitting a sensor image and cueing information to the pilot: a helmet-mounted display (HMD) or a panel-mounted display (PMD). This paper explores the trade space between DVE systems that depend on an image and those that utilize guidance algorithms, for both the PMD and HMD, as recently demonstrated during the 2016 and 2017 NATO flight trials in the United States, Germany, and Switzerland.

Paper Details

Date Published: 5 May 2017
PDF: 17 pages
Proc. SPIE 10197, Degraded Environments: Sensing, Processing, and Display 2017, 101970Y (5 May 2017); doi: 10.1117/12.2262939
Author Affiliations:
Joe Minor, U.S. Army Aviation Development Directorate (United States)
Zachariah Morford, U.S. Army Aviation Development Directorate (United States)
Walter Harrington, U.S. Army Aviation Development Directorate (United States)

Published in SPIE Proceedings Vol. 10197:
Degraded Environments: Sensing, Processing, and Display 2017
John (Jack) N. Sanders-Reed; Jarvis (Trey) J. Arthur III, Editor(s)

© SPIE.