
Proceedings Paper

Time-gated topographic LIDAR scene simulation

Paper Abstract

The Digital Imaging and Remote Sensing Image Generation (DIRSIG) model has been developed at the Rochester Institute of Technology (RIT) for over a decade. The model is an established, first-principles based scene simulation tool that has been focused on passive multi- and hyper-spectral sensing from the visible to long wave infrared (0.4 to 14 μm). Leveraging photon mapping techniques utilized by the computer graphics community, a first-principles based elastic Light Detection and Ranging (LIDAR) model was incorporated into the passive radiometry framework so that the model calculates arbitrary, time-gated radiances reaching the sensor for both the atmospheric and topographic returns. The active LIDAR module handles a wide variety of complicated scene geometries, a diverse set of surface and participating media optical characteristics, multiple bounce and multiple scattering effects, and a flexible suite of sensor models. This paper will present the numerical approaches employed to predict sensor reaching radiances and comparisons with analytically predicted results. Representative data sets generated by the DIRSIG model for a topographical LIDAR will be shown. Additionally, the results from phenomenological case studies including standard terrain topography, forest canopy penetration, and camouflaged hard targets will be presented.
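The core of a time-gated LIDAR simulation as described above is accumulating the energy of returned photon bundles into range gates by time of flight. The following is a minimal illustrative sketch of that binning step, not DIRSIG's actual implementation; the function name, parameters, and single-scalar energy model are assumptions for illustration.

```python
# Hypothetical sketch of time-gated return binning for a topographic LIDAR
# simulation. Each traced photon bundle arrives with a total optical path
# length (transmitter -> scene -> receiver) and a carried energy; bundles
# are accumulated into fixed-width time gates by time of flight.

C = 2.998e8  # speed of light in vacuum, m/s


def time_gate_returns(path_lengths_m, energies, gate_start_s, gate_width_s, n_gates):
    """Accumulate photon-bundle energies into time-of-flight gates.

    path_lengths_m : total optical path length of each bundle, in meters
    energies       : energy carried by each bundle at the receiver
    gate_start_s   : opening time of the first gate, in seconds
    gate_width_s   : temporal width of each gate, in seconds
    n_gates        : number of gates

    Returns a list of per-gate accumulated energies; bundles arriving
    outside the gated window are discarded.
    """
    gates = [0.0] * n_gates
    for length, energy in zip(path_lengths_m, energies):
        t = length / C  # arrival time for this bundle
        idx = int((t - gate_start_s) / gate_width_s)
        if 0 <= idx < n_gates:
            gates[idx] += energy
    return gates
```

For example, a hard target at 1.5 km range (3 km round-trip path) with 1 µs gates starting at the laser fire time would deposit its return in gate 10, at roughly 10 µs time of flight. Atmospheric returns along the path would populate the earlier gates in the same way.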

Paper Details

Date Published: 19 May 2005
PDF: 12 pages
Proc. SPIE 5791, Laser Radar Technology and Applications X, (19 May 2005); doi: 10.1117/12.604326
Author Affiliations:
Scott D. Brown, Rochester Institute of Technology (United States)
Daniel D. Blevins, U.S. Air Force (United States) and Rochester Institute of Technology (United States)
John R. Schott, Rochester Institute of Technology (United States)


Published in SPIE Proceedings Vol. 5791:
Laser Radar Technology and Applications X
Gary W. Kamerman, Editor(s)

© SPIE.