
Proceedings Paper

Paralleled Laplacian of Gaussian (LoG) edge detection algorithm by using GPU
Author(s): Weibin Wu

Paper Abstract

The Laplacian of Gaussian (LoG) filter is a conventional and effective edge detector. In the denoising phase, we apply a parallel Gaussian blur to the image to remove noise from the original image and to prevent that noise from being amplified by the Laplace operator. In the edge detection phase, the Laplace operator is applied to the output of the first phase. With these optimizations, running performance improves substantially compared to the pure Laplacian. Combined with the highly evolved Graphics Processing Unit (GPU), parallel image processing is more effective than its serial counterpart. In this study, the parallel LoG algorithm was implemented for images of different sizes on an NVIDIA GPU using the Compute Unified Device Architecture (CUDA). The parallel LoG algorithm proposed here greatly reduces the time required for edge detection, achieving a speedup factor of 3.7x over the serial application running on the CPU.
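The two-phase pipeline the abstract describes (Gaussian blur for denoising, then the Laplace operator for edge detection) can be sketched serially as follows. This is an illustrative NumPy implementation, not the paper's CUDA code; the kernel size, sigma, and the 3x3 Laplacian stencil are assumed values chosen for demonstration:

```python
import numpy as np

def gaussian_kernel(size=5, sigma=1.0):
    # 1-D Gaussian weights, normalized to sum to 1.
    ax = np.arange(size) - size // 2
    g = np.exp(-(ax ** 2) / (2 * sigma ** 2))
    return g / g.sum()

def convolve2d_same(img, k2d):
    # Naive 'same'-size 2-D convolution with zero padding.
    kh, kw = k2d.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(img, ((ph, ph), (pw, pw)))
    out = np.zeros(img.shape, dtype=float)
    flipped = k2d[::-1, ::-1]
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.sum(padded[i:i + kh, j:j + kw] * flipped)
    return out

def log_edges(img, sigma=1.0):
    # Phase 1: Gaussian blur to suppress noise before differentiation.
    g1 = gaussian_kernel(5, sigma)
    blurred = convolve2d_same(img, np.outer(g1, g1))
    # Phase 2: apply the Laplace operator to the blurred result.
    lap = np.array([[0.0,  1.0, 0.0],
                    [1.0, -4.0, 1.0],
                    [0.0,  1.0, 0.0]])
    return convolve2d_same(blurred, lap)
```

On the GPU, each output pixel of both convolutions is independent, which is why the paper maps the work to CUDA threads; the serial loops above are exactly the part that parallelizes.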

Paper Details

Date Published: 29 August 2016
PDF: 5 pages
Proc. SPIE 10033, Eighth International Conference on Digital Image Processing (ICDIP 2016), 1003309 (29 August 2016); doi: 10.1117/12.2244599
Author Affiliations:
Weibin Wu, Beijing Institute of Technology (China)

Published in SPIE Proceedings Vol. 10033:
Eighth International Conference on Digital Image Processing (ICDIP 2016)
Charles M. Falco; Xudong Jiang, Editor(s)

© SPIE.