Defense & Security
Airborne 3D imaging possibilities for defense and security
Several multi-rotor unmanned aerial vehicles are used to demonstrate the potential applications of 3D imaging ladar systems.
17 May 2016, SPIE Newsroom. DOI: 10.1117/2.1201604.006175
The technical development of unmanned aerial vehicle (UAV) systems, together with the miniaturization of sensor systems, means that advanced sensors can now be carried on relatively small UAVs. Rapid development of small UAVs with better propulsion systems, inertial navigation systems, electronics, and algorithms is also ongoing. Concurrently, ladar (i.e., laser detection and ranging) sensors with high reliability, accuracy, and pulse repetition frequency, as well as large data capacities and advanced data analysis techniques, are being produced. This drive toward miniaturization of both ladar and UAV systems opens up new possibilities for 3D imaging from the air. Indeed, there has been tremendous development of low-cost inertial systems1 and global positioning systems (GPS) over the last decade; the absolute accuracy of a low-cost GPS today is about ±5m. This work has often been motivated by civilian applications of low-cost UAVs,2 but it has also opened up new possibilities for the use of UAVs in military and security applications.3, 4
Military and security applications of low-cost UAVs include intelligence gathering and surveillance data collection, before, during, and after an operation. It is common for UAVs to carry visible-wavelength video cameras, as well as long-wavelength IR cameras in some cases. With these imaging methods, however, it can be difficult to detect threats that are hidden in the terrain, in shadows, or behind a window that is opaque to such wavelengths. Active 3D imaging methods can be used to increase the chance of fully surveying these types of scenes. Furthermore, the 3D information that is obtained can be used to support the detection of threats and obstacles (e.g., masts and wires) in low-visibility conditions.5, 6
For the last few years, our team at the Swedish Defence Research Agency (FOI) has been working with different multi-rotor UAVs (see Figure 1) for research and development purposes.7,8 The aim of our work is to demonstrate the possibilities for airborne sensor systems, especially 3D imaging ladar. Our efforts have focused on system integration, development of 3D data processing techniques, fusion of 3D data with data from visual and thermal cameras, and on conducting experiments. The UAVs we work with are commercial off-the-shelf products, so we are able to deploy them rapidly and collect high-resolution data. One of our complete UAV systems typically costs much less than a manned airborne sensor system. In addition, an advantage of multi-rotor UAVs is that they can hover over a particular location when a very high point density is desired. In our experience, it is faster to collect data over an area with an airborne sensor than by scanning from movable, tripod-mounted systems. With UAVs we can also cover larger survey areas, and better detect objects or regions of interest (e.g., those obscured by dense vegetation), than we can with systems based on ground vehicles.
Figure 1. Four multi-rotor Swedish Defence Research Agency (FOI) unmanned aerial vehicles (UAVs) used for research and development purposes in this work.
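One element of the fusion of 3D ladar data with visual camera imagery mentioned above is colorizing the point cloud, i.e., projecting each 3D point into a camera frame and attaching the pixel color. A minimal sketch of that step, using a standard pinhole camera model with assumed calibration matrices (this is an illustration, not the FOI processing chain):

```python
import numpy as np

def colorize_points(points, image, K, R, t):
    """Attach RGB values from a camera image to 3D points (N, 3).

    K: 3x3 camera intrinsic matrix; R, t: world-to-camera rotation and
    translation. Points behind the camera or outside the image are dropped.
    """
    cam = R @ points.T + t.reshape(3, 1)      # world -> camera frame
    in_front = cam[2] > 0                     # keep points ahead of the camera
    uvw = K @ cam[:, in_front]
    uv = (uvw[:2] / uvw[2]).round().astype(int)  # pixel coordinates
    h, w = image.shape[:2]
    ok = (uv[0] >= 0) & (uv[0] < w) & (uv[1] >= 0) & (uv[1] < h)
    pts = points[in_front][ok]
    rgb = image[uv[1, ok], uv[0, ok]]         # sample nearest pixel
    return np.hstack([pts, rgb])              # (M, 6): x, y, z, r, g, b
```

In practice the camera pose would come from the UAV's GPS/inertial navigation solution, so the colorization quality depends directly on the positioning accuracy discussed above.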
In our work we have also been investigating the potential of future UAV systems that carry ladar sensors. In these studies, we have focused on sensor concepts for one small UAV and one large UAV.9 Our sensor concepts are based on the published performance of state-of-the-art commercial sensors and on expected sensor performance. With these sensor concepts and their expected performance, we can study efficient real-time signal and image processing, as well as data storage requirements. Our ultimate goal is real-time delivery of easy-to-interpret information to the end user (see Figure 2).
Figure 2. Illustration of the signal processing approach for 3D data that is collected from a UAV.
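The data storage requirements mentioned above can be estimated with simple back-of-envelope arithmetic. The parameters below are illustrative assumptions, not specifications of the FOI sensor concepts:

```python
# Rough data-rate estimate for a scanning ladar on a small UAV.
# All figures are assumed for illustration only.

pulse_rate_hz = 300_000        # assumed pulse repetition frequency
returns_per_pulse = 2          # assumed average number of echoes per pulse
bytes_per_return = 28          # e.g., x, y, z as 8-byte floats + intensity/flags

points_per_s = pulse_rate_hz * returns_per_pulse
mb_per_s = points_per_s * bytes_per_return / 1e6
gb_per_hour = mb_per_s * 3600 / 1000

print(f"{points_per_s:,} returns/s -> {mb_per_s:.1f} MB/s, "
      f"{gb_per_hour:.1f} GB per flight hour")
```

Even these modest assumptions yield tens of gigabytes per flight hour, which illustrates why the split between on-board processing and ground-station processing matters.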
Our small UAV concept is equipped with a laser scanner and has a total weight of 5–10kg. It is very similar to the UAVs that FOI uses for other research purposes and therefore gives us the opportunity to implement and analyze the proposed signal processing chain in real situations. In particular, we have evaluated which algorithms should run on board the UAV and which are better suited to the ground station. The focus of this part of our work was on 3D sensing with state-of-the-art sensors, signal and image processing of the data, and data fusion. In contrast, our proposed large UAV concept has a total weight of about 150kg and includes a new type of ladar with a photon-counting detector. This type of sensor is less mature, and our work is thus concentrated on developing the algorithms that will be used to create point clouds. The sensor produces high noise levels at high data rates, so computationally efficient noise reduction and signal characterization are required. We use simulated data to evaluate different signal processing concepts for this system.
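A common way to suppress the background noise of a photon-counting detector (one option among several, sketched here as an illustration rather than as the FOI algorithm) is to histogram the photon times of arrival: signal photons cluster at the true round-trip time, while background counts are spread roughly uniformly, so the histogram peak marks the target.

```python
import numpy as np

def estimate_range(arrival_times_ns, bin_ns=1.0, c_m_per_ns=0.299792458):
    """Estimate target range from noisy photon time-of-arrival events.

    Histograms the events, takes the peak bin as the round-trip time,
    and converts to range via range = c * t / 2.
    """
    edges = np.arange(0.0, arrival_times_ns.max() + bin_ns, bin_ns)
    counts, _ = np.histogram(arrival_times_ns, bins=edges)
    peak = np.argmax(counts)
    t_peak = (edges[peak] + edges[peak + 1]) / 2.0   # bin-center time, ns
    return c_m_per_ns * t_peak / 2.0                 # one-way range, m
```

Because only a histogram and an argmax are needed per range cell, this kind of processing can be made computationally efficient enough for the high data rates mentioned above.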
We have also conducted a series of user tests with the Swedish Armed Forces (SwAF), using interviews and experiments to understand the types of 3D data processing that our systems should perform. We will also use the information collected to plan how our imaging results should be presented to the user, so that their chances of making correct decisions in challenging situations are maximized (see Figure 3). Testing 3D maps in SwAF mission-integrated simulations provides fast feedback on the real demands on a 3D model (created from sensor data) and on its visualization. These simulations will also allow us to close the loop between sensors/UAVs and combat effectiveness.
Figure 3. Top: Image of the interior of a building. The data for this image was collected from a vehicle-mounted Riegl VZ400 instrument (i.e., the measurement was performed from outside the building). Bottom: Analysis of the viewshed (i.e., area visible) from the location marked with the black dot. Green areas indicate a free line of sight, yellow areas have a partial line of sight, and the red areas have no line of sight. Data obtained from www.swescan.se.
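The viewshed analysis of Figure 3 amounts to a line-of-sight test from the observer position to every other location in the scene. A minimal 2D sketch of that idea on an occupancy grid (a simplification for illustration; the analysis in Figure 3 operates on full 3D point-cloud data):

```python
import numpy as np

def viewshed(occ, origin):
    """Mark which cells of a 2D occupancy grid (True = blocking) have a
    free line of sight from `origin`, by marching a ray to every cell."""
    h, w = occ.shape
    visible = np.zeros((h, w), dtype=bool)
    r0, c0 = origin
    for r in range(h):
        for c in range(w):
            n = max(abs(r - r0), abs(c - c0), 1)
            rs = np.linspace(r0, r, n + 1).round().astype(int)
            cs = np.linspace(c0, c, n + 1).round().astype(int)
            # a cell is visible unless the ray hits an obstacle before it
            visible[r, c] = not any(occ[rr, cc]
                                    for rr, cc in zip(rs[:-1], cs[:-1]))
    return visible
```

An obstacle cell itself is classified as visible (one can see the wall), while cells behind it are not, mirroring the green/red classification in Figure 3.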
We have been investigating the 3D sensing and imaging capabilities of ladar sensors carried on board UAVs. In our work, we consider a variety of sensor concepts, UAV platforms, and signal processing techniques. We have also conducted user studies in collaboration with the Swedish Armed Forces. The next stages of our research will include improving the positioning data and speeding up the processing chain for laser scanner data. We will also investigate algorithms that produce point clouds from the photon-counting sensor.
This work was funded by the Research and Development program of the Swedish Armed Forces and the Swedish Defence Material Administration. We also acknowledge the students, soldiers, and others who have participated in our user tests.
Swedish Defence Research Agency (FOI)
Christina Grönwall received her PhD in 2006 and is now a deputy research director and the scientific leader for sensor informatics and multi-sensor fusion at FOI. Since 2014 she has also been an adjunct associate professor at Linköping University.
1. H. Chao, C. Coopmans, L. Di, Y. Q. Chen, A comparative evaluation of low-cost IMUs for unmanned autonomous systems, Proc. IEEE Conf. Multisens. Fusion Integ. Intell. Syst., p. 211-216, 2010. doi:10.1109/MFI.2010.5604460
2. A. Jaakkola, J. Hyyppä, A. Kukko, X. Yu, H. Kaartinen, M. Lehtomäki, Y. Lin, A low-cost multi-sensoral mobile mapping system and its feasibility for tree measurements, ISPRS J. Photogramm. Remote Sens. 65, p. 514-522, 2010.
3. A. Bird, S. A. Anderson, M. Wojcik, S. E. Budge, Small SWAP 3D imaging flash ladar for small tactical unmanned air systems, Proc. SPIE 9460, p. 946008, 2015. doi:10.1117/12.2177222
4. F. Nex, F. Remondino, UAV for 3D mapping applications: a review, Appl. Geomat. 6, p. 1-15, 2014.
5. G. Conte, A. Kleiner, P. Rudol, K. Korwel, M. Wzorek, P. Doherty, Performance evaluation of a light-weight multi-echo lidar for unmanned rotorcraft applications, Int'l Arch. Photogramm. Remote Sens. Spatial Inf. Sci. XL-1/W2, p. 87-92, 2013.
6. M. Nieuwenhuisen, D. Droeschel, J. Schneider, D. Holz, T. Läbe, S. Behnke, Multimodal obstacle detection and collision avoidance for micro aerial vehicles, Euro. Conf. Mobile Robots, p. 7-12, 2013. doi:10.1109/ECMR.2013.6698812
7. H. M. Tulldahl, H. Larsson, Lidar on small UAV for 3D mapping, Proc. SPIE 9250, p. 925009, 2014. doi:10.1117/12.2068448
8. H. M. Tulldahl, F. Bissmarck, H. Larsson, C. Grönwall, G. Tolt, Accuracy evaluation of 3D lidar data from small UAV, Proc. SPIE 9649, p. 964903, 2015. doi:10.1117/12.2194508
9. C. Grönwall, G. Tolt, P. Lif, H. Larsson, F. Bissmarck, M. Tulldahl, M. Henriksson, P. Wikberg, M. Thorstensson, 3D sensing and imaging for UAVs, Proc. SPIE 9649, p. 96490C, 2015. doi:10.1117/12.2192834