LiDAR is one of many active sensor technologies that use electromagnetic radiation. Operating at optical and infrared wavelengths, it is similar to the more familiar passive EO/IR sensor technologies. It is also similar to radar in that it detects electromagnetic radiation that the sensor itself emits and that reflects back from the scene. LiDAR is commonly used for making high-resolution maps and has applications in geodesy, geomatics, archaeology, geography, geology, geomorphology, seismology, forestry, atmospheric physics, laser guidance, airborne laser swath mapping, and laser altimetry. It is also used for control and navigation of some autonomous cars.
The first part of LiDAR Technologies and Systems introduces LiDAR and its history, and then covers the LiDAR range equation and the link budget (how much signal a LiDAR must emit in order to get a certain number of reflected photons back), as well as the rich phenomenology of LiDAR, which results in a diverse array of LiDAR types. The middle chapters discuss the components of a LiDAR system, including laser sources and modulators, LiDAR receivers, beam-steering approaches, and LiDAR processing. The last part covers testing, performance metrics, and significant applications, including how to build systems for some of the more popular applications.
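As a rough illustration of the link-budget idea mentioned above, the sketch below estimates the number of signal photons returned per pulse from an extended Lambertian target that fills the beam. All parameter values (pulse energy, wavelength, reflectance, range, aperture diameter, and efficiencies) are illustrative assumptions, not values taken from the book; the book's Chapter 3 treats the full range equation, including the illuminator beam, target cross-section, and atmospheric effects.

```matlab
% Rough link-budget sketch for an extended Lambertian target that fills the beam.
% All parameter values below are illustrative assumptions.
E_tx    = 1e-6;       % transmitted pulse energy [J]
lambda  = 1.55e-6;    % laser wavelength [m]
rho     = 0.1;        % target reflectance
R       = 100;        % range to target [m]
D_rx    = 25e-3;      % receive-aperture diameter [m]
eta_sys = 0.5;        % combined optics and detector efficiency
T_atm   = 0.9;        % one-way atmospheric transmission

h = 6.626e-34;        % Planck's constant [J*s]
c = 3e8;              % speed of light [m/s]

% Fraction of the Lambertian return captured by the receive aperture ~ D^2/(4*R^2)
E_rx = E_tx * rho * (D_rx^2 / (4*R^2)) * eta_sys * T_atm^2;   % received energy [J]
N_photons = E_rx / (h*c/lambda);                              % detected photon count

fprintf('Estimated signal photons per pulse: %.0f\n', N_photons);
```

With these assumed numbers the return is a few thousand photons per pulse; changing the range, aperture, or pulse energy shows how quickly the received signal can drop toward the photon-counting regime.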
Pages: 520
ISBN: 9781510625396
Volume: PM300
Table of Contents
- Preface
- 1 Introduction to LiDAR
- 1.1 Context of LiDAR
- 1.2 Conceptual Discussion of LiDAR
- 1.3 Terms for Active EO Sensing
- 1.4 Types of LiDARs
- 1.4.1 Some LiDARs for surface-scattering (hard) targets
- 1.4.2 Some LiDARs for volume-scattering (soft) targets
- 1.5 LiDAR Detection Modes
- 1.6 Flash LiDAR versus Scanning LiDAR
- 1.7 Eye Safety Considerations
- 1.8 Laser Safety Categories
- 1.9 Monostatic versus Bistatic LiDAR
- 1.10 Transmit/Receive Isolation
- 1.11 Major Devices in a LiDAR
- 1.11.1 Laser sources
- 1.11.2 Receivers
- 1.11.3 Apertures
- 1.12 Organization of this Book
- Problems and Solutions
- References
- 2 History of LiDAR
- 2.1 Rangefinders, Altimeters, and Designators
- 2.1.1 First steps of rangefinders
- 2.1.2 Long-distance rangefinders
- 2.1.3 Laser altimeters
- 2.1.4 Laser designators
- 2.1.5 Obstacle avoidance applications
- 2.2 Early Coherent LiDARs
- 2.2.1 Early work at MIT Lincoln Lab
- 2.2.2 Early coherent LiDAR airborne applications
- 2.2.3 Autonomous navigation using coherent LiDAR
- 2.2.4 Atmospheric wind sensing
- 2.2.5 Laser vibrometry
- 2.2.6 Synthetic-aperture LiDAR
- 2.3 Early Space-based LiDAR
- 2.4 Flight-based Laser Vibrometers
- 2.5 Environmental LiDARs
- 2.5.1 Early steps
- 2.5.2 Multiwavelength LiDARs
- 2.5.3 LiDAR sensing in China
- 2.5.4 LiDAR sensing in Japan
- 2.6 Imaging LiDARs
- 2.6.1 Early LiDAR imaging
- 2.6.2 Imaging LiDARs for manufacturing
- 2.6.3 Range-gated imaging programs
- 2.6.4 3D LiDAR
- 2.6.5 Imaging for weapon guidance
- 2.6.6 Flash-imaging LiDAR
- 2.6.7 Mapping LiDAR
- 2.6.8 LiDARs for underwater: laser-based bathymetry
- 2.6.9 Laser micro-radar
- 2.7 History Conclusion
- References
- 3 LiDAR Range Equation
- 3.1 Introduction to the LiDAR Range Equation
- 3.2 Illuminator Beam
- 3.3 LiDAR Cross-Section
- 3.3.1 Cross-section of a corner cube
- 3.4 Link Budget Range Equation
- 3.5 Atmospheric Effects
- 3.5.1 Atmospheric scattering
- 3.5.2 Atmospheric turbulence
- 3.5.3 Aero-optical effects on LiDAR
- 3.5.4 Extended (deep) turbulence
- 3.5.5 Speckle
- Problems and Solutions
- References
- 4 Types of LiDAR
- 4.1 Direct-Detection LiDAR
- 4.1.1 1D range-only LiDAR
- 4.1.2 Tomographic imaging LiDAR
- 4.1.3 Range-gated active imaging (2D LiDAR)
- 4.1.4 3D scanning LiDAR
- 4.1.5 Flash imaging
- 4.1.6 3D mapping applications
- 4.1.7 Laser-induced breakdown spectroscopy
- 4.1.8 Laser-induced fluorescence
- 4.1.9 Active multispectral LiDAR
- 4.1.10 LiDARs using polarization as a discriminant
- 4.2 Coherent LiDAR
- 4.2.1 Laser vibration detection
- 4.2.2 Range-Doppler imaging LiDAR
- 4.2.3 Speckle imaging LiDAR
- 4.2.4 Aperture-synthesis–based LiDAR
- 4.3 Multiple-Input, Multiple-Output Active EO Sensing
- Appendix: MATLAB® program showing synthetic-aperture pupil planes and MTFs
- Problems and Solutions
- References
- 5 LiDAR Sources and Modulations
- 5.1 Laser Background Discussion
- 5.2 Laser Waveforms for LiDAR
- 5.2.1 Introduction
- 5.2.2 High time–bandwidth product waveforms
- 5.2.3 Radiofrequency modulation of a direct-detection LiDAR
- 5.2.4 Femtosecond-pulse-modulation LiDAR
- 5.2.5 Laser resonators
- 5.2.6 Three-level and four-level lasers
- 5.2.7 Laser-pumping considerations
- 5.2.8 Q-switched lasers for LiDAR
- 5.2.9 Mode-locked lasers for LiDAR
- 5.2.10 Laser seeding for LiDAR
- 5.2.11 Laser amplifier for LiDAR
- 5.3 Lasers Used in LiDAR
- 5.3.1 Diode lasers for LiDAR
- 5.4 Bulk Solid State Lasers for LiDAR
- 5.4.1 Fiber lasers for LiDAR
- 5.4.2 Nonlinear devices to change LiDAR wavelength
- 5.5 Fiber Format
- Problems and Solutions
- References
- 6 LiDAR Receivers
- 6.1 Introduction to LiDAR Receivers
- 6.2 LiDAR Signal-to-Noise Ratio
- 6.2.1 Noise probability density functions
- 6.2.2 Thermal noise
- 6.2.3 Shot noise
- 6.2.4 Background noise
- 6.2.5 Dark current, 1/f noise, and excess noise
- 6.3 Avalanche Photodiodes and Direct Detection
- 6.3.1 Linear-mode APD arrays for LiDAR
- 6.3.2 Direct-detection GMAPD LiDAR camera
- 6.4 Silicon Detectors
- 6.5 Heterodyne Detection
- 6.5.1 Temporal heterodyne detection
- 6.5.2 Heterodyne mixing efficiency
- 6.5.3 Quadrature detection
- 6.5.4 Carrier-to-noise ratio (CNR) for temporal heterodyne detection
- 6.5.5 Spatial heterodyne detection / digital holography
- 6.5.6 Receivers for coherent LiDARs
- 6.5.7 Geiger-mode APDs for coherent imaging
- 6.5.8 PIN diode or LMAPDs for coherent imaging
- 6.5.9 Sampling associated with temporal heterodyne sensing
- 6.6 Long–Frame-Time Framing Detectors for LiDAR
- 6.7 Ghost LiDARs
- 6.8 LiDAR Image Stabilization
- 6.9 Optical–Time-of-Flight Flash LiDAR
- 6.9.1 Summary of advantages and disadvantages of OTOF cameras
- Problems and Solutions
- References
- 7 LiDAR Beam Steering and Optics
- 7.1 Mechanical Beam-Steering Approaches for LiDAR
- 7.1.1 Gimbals
- 7.1.2 Fast-steering mirrors
- 7.1.3 Risley prisms and Risley gratings
- 7.1.4 Rotating polygonal mirrors
- 7.1.5 MEMS beam steering for LiDAR
- 7.1.6 Lenslet-based beam steering
- 7.2 Nonmechanical Beam-Steering Approaches for Steering LiDAR Optical Beams
- 7.2.1 OPD-based nonmechanical approaches
- 7.2.2 Chip-scale optical phased arrays
- 7.2.3 Electrowetting beam steering for LiDAR
- 7.2.4 Using electronically written lenslets for lenslet-based beam steering
- 7.2.5 Beam steering using EO effects
- 7.2.6 Phase-based nonmechanical beam steering
- 7.3 Some Optical Design Considerations for LiDAR
- 7.3.1 Geometrical optics
- 7.3.2 Adaptive optics systems
- 7.3.3 Adaptive optics elements
- Problems and Solutions
- Notes and References
- 8 LiDAR Processing
- 8.1 Introduction
- 8.2 Generating LiDAR Images/Information
- 8.2.1 Range measurement processing
- 8.2.2 Range resolution of LiDAR
- 8.2.3 Angle LiDAR processing
- 8.2.4 Gathering information from a temporally coherent LiDAR
- 8.2.5 General LiDAR processing
- 8.2.6 Target classification using LiDAR
- Problems and Solutions
- References
- 9 Figures of Merit, Testing, and Calibration for LiDAR
- 9.1 Introduction
- 9.2 LiDAR Characterization and Figures of Merit
- 9.2.1 Ideal point response main lobe width
- 9.2.2 Integrated sidelobe ratio
- 9.2.3 Peak sidelobe ratio
- 9.2.4 Spurious sidelobe ratio
- 9.2.5 Noise-equivalent vibration velocity
- 9.2.6 Ambiguity velocity
- 9.2.7 Unambiguous range
- 9.3 LiDAR Testing
- 9.3.1 Angle/angle/range resolution testing
- 9.3.2 Velocity measurement
- 9.3.3 Measuring range walk
- 9.4 LiDAR Calibration
- 9.4.1 Dark nonuniform correction
- 9.4.2 Results of correction
- Problems and Solutions
- References
- 10 LiDAR Performance Metrics
- 10.1 Image Quality Metrics
- 10.1.1 Object parameters
- 10.2 LiDAR Parameters
- 10.3 Image Parameters: National Imagery Interpretability Rating Scale (NIIRS)
- 10.4 3D Metrics for LiDAR Images
- 10.5 General Image Quality Equations
- 10.6 Quality Metrics Associated with Automatic Target Detection, Recognition, or Identification
- 10.7 Information Theory Related to Image Quality Metrics
- 10.8 Image Quality Metrics Based on Alternative Basis Sets
- 10.9 Eigenmodes
- 10.10 Compressive Sensing
- 10.10.1 Knowledge-enhanced compressive sensing
- 10.10.2 Scale-invariant feature transform
- 10.11 Machine Learning
- 10.12 Processing to Obtain Imagery
- 10.13 Range Resolutions in EO/IR Imagers
- 10.14 Current LiDAR Metric Standards
- 10.15 Conclusions
- Appendix: MATLAB code to Fourier transform an image
- Problems and Solutions
- Notes and References
- 11 Significant Applications of LiDAR
- 11.1 Auto LiDAR
- 11.1.1 Introduction
- 11.1.2 Resolution
- 11.1.3 Frame rate
- 11.1.4 Laser options
- 11.1.5 Eye safety
- 11.1.6 Unambiguous range
- 11.1.7 Required laser energy per pulse and repetition rate
- 11.1.8 Obscurants considered for auto LiDAR
- 11.1.9 Keeping the auto-LiDAR aperture clear
- 11.2 3D Mapping LiDAR
- 11.2.1 Introduction to 3D mapping LiDAR
- 11.2.2 3D mapping LiDAR design
- 11.3 Laser Vibrometers
- 11.3.1 Designing a laser vibrometer
- 11.4 Wind Sensing
- Problems and Solutions
- References
- Index
Preface
About six years ago, I co-taught a semester-long course in LiDAR technology with Dr. Ed Watson at the LiDAR and Optical Communications Institute (LOCI) of the University of Dayton. At the time, there was a book that covered part of what I wanted to teach, but it did not cover all of the areas I thought should go into the course. There were a couple of other books that had interesting LiDAR-related material, but no book that covered all of the topics that I thought were needed. Since then, I have done a number of week-long, or almost-week-long, courses. One of those I co-taught with Gary Kamerman, and a number of them with Ed Watson. Shortly after teaching that 2012 semester-long course, I started writing the Field Guide to LiDAR, which was published by SPIE Press in 2015. I thought a shorter book would be easier to write than a longer one. I was wrong. The Field Guide came out great, but its format of one topic per page made it a challenging type of book to write. Also, when I finished writing the Field Guide, I still did not have a really good textbook on LiDAR technology and systems. Thus, the decision to write this book grew out of the need for a good teaching reference for a longer course on LiDAR. The Field Guide is great as a quick reference, with all of the equations in one place and each topic concisely presented, but it does not provide enough background or detail to be a textbook. This book presents in-depth coverage of LiDAR technology and systems, and the Field Guide serves as a reminder of the essential points and equations once you already understand the technology.
I learned a lot writing the Field Guide to LiDAR, and then writing this book. When I considered all of the topics that should be covered in this book, there were some I knew really well, and some I knew less well. The neat thing I found about writing a book like this is that, before I could effectively explain a particular concept, I needed to clearly understand it. To this end, the comparison paper I recently wrote with Ed Watson, Andrew Huntington, Dale Fried, Paul Banks, and Jeffrey D. Beck taught me a lot about receivers. I have included sections from that paper in this book. The paper on the history of laser radar in the U.S. that I wrote with Milt Huffaker and Gary Kamerman, and the more recent paper "Laser radar: historical prospective—from the East to the West," which I wrote with Vasyl Molebny, Ove Steinvall, T. Kobayashi, and W. Chen, both provide a good summary of the history of LiDAR. Chairing the 2014 United States National Academy of Sciences study on laser radar helped me learn more. Of course, decades of experience monitoring LiDAR development for the Air Force taught me a lot as well.
Once I started working on this book, I had two students take a self-study course with me in LiDAR. Both students read early versions of the manuscript and developed possible problems to include at the end of each chapter. Dr. Abtin Ataei was the first student to do this, and Andrew Reinhardt was the second. I am grateful to both of them. For the last chapter on LiDAR applications, I felt I did not know the 3D mapping area as well as I should. Dr. Mohan Vaidyanathan, my former postdoc, works for Harris Corporation (now merged with L3 Technologies) on one of the commercial 3D mappers and volunteered to contribute. Admittedly, he is an advocate of the Geiger-mode version of 3D mapping, as he should be, given where he works, but I knew that. He and his colleagues provided significant input. To balance things out, I also requested information from Optech and RIEGL.
MIT Lincoln Laboratory has a nice library of 3D LiDAR images. I would like to thank them for providing one of those images for the book cover.
Finally, I thank Dara Burrows, Senior Editor at SPIE Press, whose tireless work editing this book has made it happen.
This has been an educational experience, and I am pleased with the way the book has turned out. I hope you enjoy it, and I hope many people can use it to learn more about LiDAR technology and systems. I enjoyed writing it and, as I mentioned, learned a lot in certain areas. Perhaps, once in a while, an author learns almost as much by writing a book as a reader learns by reading it!
Paul McManamon
May 2019