
Remote Sensing

Imaging data detects changes in urban areas over time

A scheme for identifying altered features of cityscapes that compares existing building models with new lidar data points and aerial images improves the accuracy of 3D spatial information.
19 May 2011, SPIE Newsroom. DOI: 10.1117/2.1201104.003539

Computer-generated building models provide 3D spatial information for a variety of applications, such as city planning, automobile-navigation systems, and spatial inquiry (e.g., Google Earth or Microsoft Bing Maps). As urbanization proceeds apace worldwide, 3D geographic systems depend on building models that are updated on a regular basis, generally by one of two methods. The first is to recreate the entire area under examination through a mapping procedure. The second, preferable in terms of effort and cost, looks only for features that have changed and then reconstructs just those. The effectiveness of this approach depends heavily on the ability to identify changes in the built landscape over time.

Currently, changes are usually measured through spectral (color) analysis of aerial images and lidar (light detection and ranging) data. Each technique has advantages and disadvantages in terms of horizontal and vertical accuracy. Airborne lidar provides more accurate height information than aerial images, but building boundaries are less well defined. Aerial images provide richer 2D structure, such as high-resolution texture and color, and 3D structure can be derived from one or more images by techniques such as stereo matching and shape from shading. But the information extracted this way is still less precise.1


Figure 1. Workflow for detecting changes in a 3D landscape by combining old and new technologies. Lidar: Light detection and ranging.
Table 1. Performance comparison between model-updating strategies. Kappa value: Proportion of agreement after chance agreement is removed from consideration.
                      Single thresholding   Double thresholding
Overall accuracy            0.931                 0.959
Producer's accuracy         0.874                 0.937
User's accuracy             0.779                 0.852
Kappa value                 0.714                 0.829

Recently, a number of new methods for detecting changes using lidar have been proposed.2 Earlier studies used vector maps,3 lidar,4 aerial imagery,5 or 3D building models as the initial comparison set.6 We sought to integrate lidar data and aerial images into a single system. Here, we describe work aimed at detecting changes in a landscape by comparing old 3D building models with newly obtained lidar and aerial imagery.7 Figure 1 shows the workflow of the method.

To compare images taken at different times, we first register the lidar data, aerial images, and building models to the same coordinate frame. We then determine any alterations by examining spectral information from the aerial images, height differences between the lidar points and the building models, and linear features of the aerial images. Spectral information helps to locate vegetation, so that vegetated areas containing no manmade structures can be excluded. Next, we use the lidar data to detect the points that represent the roof planes of buildings. The height differences between these points and the building models become our primary indicators of new building features. We use line features of the aerial images for further refinement.
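As a rough illustration of the height-difference step, the following sketch (assuming lidar points already registered to the model's coordinate frame; the function and array names are our own, not from the study) computes the mean discrepancy between lidar roof returns inside a building footprint and the old model's roof height:

```python
import numpy as np

def roof_height_difference(lidar_points, footprint_mask, model_roof_height):
    """Mean height discrepancy between registered lidar roof points and
    an existing building model's roof plane.

    lidar_points      : (N, 3) array of x, y, z lidar returns
    footprint_mask    : boolean (N,) array, True where a point falls
                        inside the building footprint
    model_roof_height : roof elevation of the old 3D model (same datum)
    """
    roof_z = lidar_points[footprint_mask, 2]
    if roof_z.size == 0:
        return None  # no lidar returns over this building
    return float(np.mean(roof_z) - model_roof_height)
```

A large absolute value of this difference flags the building as a candidate for main-structure change; values near zero suggest the model is still current.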


Figure 2. The test data set: Hsin-Chu City, northern Taiwan. Top: Three-dimensional building model. Bottom: Aerial image (left) and lidar data (right). The gray scale of the lidar data indicates height, with black lowest and white highest.

Based on our comparison of data, images, and models, we categorize our findings as ‘unchanged,’ ‘main-structure changed,’ or ‘microstructure alterations.’ Since height difference is the major indicator of variation, we employ a double-thresholding strategy—e.g., setting an upper bound of 3m and a lower bound of 1m—to detect obviously changed and unchanged areas in buildings. The line-feature comparisons help to further identify the areas of interest (the data set between the two thresholds), which are then singled out for additional study.
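The double-thresholding logic described above can be sketched as follows (a simplified illustration; the 3m and 1m bounds are the example values from the text, while the function and label names are hypothetical):

```python
def classify_height_change(dz, upper=3.0, lower=1.0):
    """Classify a building's roof-height discrepancy dz (meters, relative
    to the old model) using two thresholds.

    |dz| >= upper -> clearly changed main structure
    |dz| <  lower -> clearly unchanged
    otherwise     -> candidate region, passed on to line-feature comparison
    """
    dz = abs(dz)
    if dz >= upper:
        return "changed"
    if dz < lower:
        return "unchanged"
    return "candidate"
```

Only the "candidate" buildings, those falling between the two thresholds, require the more expensive line-feature comparison.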

Figure 2 shows our test data over an area of Hsin-Chu City in northern Taiwan: old building models (top), a new aerial image (bottom left), and lidar data (bottom right). The old models, comprising 492 buildings, are polyhedral in shape and were built photogrammetrically from a pair of 2002 stereoscopic images. The new image, with 12cm resolution, and the lidar data, with a density of 1.7 points per square meter, were acquired in 2005. Our scheme detects changes to the old buildings using the new image and lidar data sets. Table 1 compares the performance of the proposed double-thresholding strategy with the traditional single-thresholding approach. The new strategy offers higher accuracy on all of the common indices.
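The indices in Table 1 all follow from a standard binary (changed/unchanged) confusion matrix. As a sketch (the counts below are made up for illustration, not the study's data):

```python
def accuracy_indices(tp, fp, fn, tn):
    """Change-detection accuracy indices from a binary confusion matrix.

    tp: changed buildings correctly detected as changed
    fp: unchanged buildings wrongly flagged as changed
    fn: changed buildings missed
    tn: unchanged buildings correctly left alone
    """
    n = tp + fp + fn + tn
    overall = (tp + tn) / n
    producers = tp / (tp + fn)  # complement of omission error
    users = tp / (tp + fp)      # complement of commission error
    # Kappa: agreement remaining after chance agreement is removed
    p_chance = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / (n * n)
    kappa = (overall - p_chance) / (1 - p_chance)
    return overall, producers, users, kappa
```

For example, a matrix with 40 true positives, 10 false positives, 5 false negatives, and 45 true negatives gives an overall accuracy of 0.85 and a kappa of 0.70.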

In summary, we have proposed a new scheme for detecting building changes in an urban environment by comparing existing building models with new lidar data points and aerial images, together with a double-thresholding strategy that improves detection accuracy. Some detection errors can be attributed to registration inaccuracies and small roof variations, so improving registration and the accuracy of the roof models should in turn yield better results. One limitation of our approach arises in areas where buildings are occluded by a vegetation canopy.

Our objectives for the next phase of this work focus on improving our results. First, we plan to detect not only changes from existing models but also those for newer structures in the test area. Second, we would like to experiment with registering the detailed data model by model instead of all at once to improve the accuracy of detection. Finally, we will include more detailed information about the structures under study.


Liang-Chien Chen, Li-Jer Lin, Wen-Chi Chang
National Central University
Jhongli, Taiwan

Liang-Chien Chen received his PhD from the University of Illinois, Urbana (1985). He was at the Institute of Photogrammetry, National Cheng Kung University, Taiwan, from 1985 to 1986. Since 1986, he has been a professor at National Central University's Center for Space and Remote Sensing Research, which he also directs. His research activities are focused on digital photogrammetry, geometrical data processing for remotely sensed data, image feature extraction, lidar processing, and terrain analysis.

Li-Jer Lin received his MSE (2010) from the Department of Civil Engineering, National Central University. He is currently in military service.

Wen-Chi Chang received her MSE (2009) from the Department of Civil Engineering, National Central University. In 2009 she joined the Center for Space and Remote Sensing Research as a research assistant.


References:
1. D. H. Lee, K. M. Lee, S. U. Lee, Fusion of lidar and imagery for reliable building extraction,  Photogramm. Eng. Remote Sens. 74, no. 2, pp. 215-225, 2008.
2. H. Murakami, K. Nakagawa, H. Hasegawa, T. Shibata, Change detection of buildings using an airborne laser scanner, ISPRS J. Photogramm. Remote Sens. 54, pp. 148-152, 1999. doi:10.1016/S0924-2716(99)00006-4
3. T. Knudsen, B. P. Olsen, Automated change detection for updates of digital map databases, Photogramm. Eng. Remote Sens. 69, no. 11, pp. 1289-1296, 2003.
4. D. Girardeau-Montaut, M. Roux, R. Marc, G. Thibault, Change detection on points cloud data acquired with a ground laser scanner, ISPRS Laser Scan. Workshop, pp. 30-35, 2005.
5. F. Jung, Detecting building changes from multitemporal aerial stereopairs, ISPRS J. Photogramm. Remote Sens. 58, pp. 187-201, 2004. doi:10.1016/j.isprsjprs.2003.09.005
6. C. Y. Huang, The integration of shape and spectral information for change detection of building models, Master's thesis. National Central University, Taiwan, 2008. In Chinese. 
7. L.-C. Chen, L.-J. Lin, Detection of building changes from aerial images and light detection and ranging (LIDAR) data, J. Appl. Remote Sens. 4, 041870, 2010. doi:10.1117/1.3525560