
Proceedings Paper

Real-time rendering of optical effects using spatial convolution
Author(s): Przemyslaw Rokita

Paper Abstract

Simulation of special effects such as defocus, depth of field, or raindrops and water film falling on a windshield can be very useful in visual simulators and in any computer graphics application that needs realistic images of outdoor scenery. These effects are especially important for rendering poor visibility conditions in flight and driving simulators, but they can also be applied, for example, when compositing computer graphics with video sequences, i.e., in Augmented Reality systems. This paper proposes a new approach to rendering these optical effects by iterative adaptive filtering using spatial convolution. The advantage of this solution is that the adaptive convolution can be performed in real time by existing hardware. The optical effects mentioned above can be introduced into an image computed with a conventional camera model by applying to the intensity of each pixel a convolution filter having an appropriate point spread function. The algorithms described in this paper can be easily implemented in the visualization pipeline: the final effect may be obtained by iterative filtering using a single hardware convolution filter, or with a pipeline composed of identical 3 × 3 filters placed as the stages of this pipeline. Another advantage of the proposed solution is that an extension based on the proposed algorithm can be added to existing rendering systems as a final stage of the visualization pipeline.
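The abstract describes obtaining a wide effective point spread function by repeatedly applying a small 3 × 3 convolution filter as a post-processing stage. The following is a minimal sketch of that idea, not the paper's implementation: the kernel weights, the function names (`convolve3x3`, `defocus`), and the iteration count are illustrative assumptions chosen to show how iterated 3 × 3 filtering widens the blur.

```python
import numpy as np

# Sketch only: repeated application of a small 3x3 blur kernel approximates
# a wider point spread function, giving a defocus-like effect. The kernel
# weights and iteration counts below are illustrative assumptions.

BLUR_3X3 = np.array([[1, 2, 1],
                     [2, 4, 2],
                     [1, 2, 1]], dtype=np.float64) / 16.0  # binomial smoothing kernel

def convolve3x3(image, kernel):
    """One 3x3 spatial convolution with edge replication (one pipeline stage)."""
    padded = np.pad(image, 1, mode="edge")
    out = np.zeros(image.shape, dtype=np.float64)
    for dy in range(3):
        for dx in range(3):
            out += kernel[dy, dx] * padded[dy:dy + image.shape[0],
                                           dx:dx + image.shape[1]]
    return out

def defocus(image, iterations):
    """Iterative filtering: more iterations -> wider effective PSF (stronger blur)."""
    result = image.astype(np.float64)
    for _ in range(iterations):
        result = convolve3x3(result, BLUR_3X3)
    return result

# Example: soften a synthetic grayscale frame as a final post-processing stage.
frame = np.zeros((64, 64))
frame[28:36, 28:36] = 1.0           # bright square in an otherwise dark scene
softened = defocus(frame, iterations=4)
```

In hardware, each call to the 3 × 3 filter would correspond to one stage of the convolution pipeline, so the blur width (and thus the simulated optical effect) can be controlled by the number of stages or iterations.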

Paper Details

Date Published: 12 March 1998
PDF: 9 pages
Proc. SPIE 3303, Real-Time Imaging III, (12 March 1998); doi: 10.1117/12.302418
Author Affiliations:
Przemyslaw Rokita, Warsaw Univ. of Technology (Poland)


Published in SPIE Proceedings Vol. 3303:
Real-Time Imaging III
Divyendu Sinha, Editor(s)

© SPIE.