
Proceedings Paper

Principles of covariance propagation
Author(s): Robert M. Haralick

Paper Abstract

This paper describes how to propagate approximately additive random perturbations through any kind of vision algorithm step in which the appropriate random perturbation model for the estimated quantity produced by the step is also an additive random perturbation. We assume that the vision algorithm step can be modeled as a calculation (linear or non-linear) that produces an estimate minimizing an implicit scalar function of the input quantity and the calculated estimate. The only further assumptions are that the scalar function is non-negative, has finite first and second partial derivatives, and is zero for ideal data, and that the random perturbations are small enough that the relationship between the scalar function evaluated at the ideal but unknown input and output quantities and evaluated at the observed input quantity and perturbed output quantity can be approximated sufficiently well by a first-order Taylor series expansion. Finally, the paper discusses how to verify that the derived statistical behavior agrees with the experimentally observed statistical behavior.
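The propagation rule the abstract describes can be sketched numerically. In the following illustrative example (not taken from the paper), the vision step is a least-squares line fit, the scalar criterion is F(X, Θ) = ||JΘ − X||², and the first-order rule gives the output covariance as Σ_Θ = A Σ_X Aᵀ with A = −(∂²F/∂Θ²)⁻¹ (∂²F/∂Θ∂X) evaluated at the solution. The design matrix, noise level, and Monte Carlo check are all assumptions made for illustration:

```python
# Sketch: propagate an additive input covariance through a minimization
# step Theta_hat = argmin_Theta F(X, Theta) via the first-order rule
#   Sigma_Theta = A Sigma_X A^T,  A = -(d2F/dTheta2)^{-1} (d2F/dTheta dX).
import numpy as np

rng = np.random.default_rng(0)

# Example step: least-squares line fit, F(X, Theta) = ||J Theta - X||^2,
# with fixed design matrix J and observed ordinates X (assumed setup).
t = np.linspace(0.0, 1.0, 20)
J = np.column_stack([np.ones_like(t), t])    # model: theta0 + theta1 * t
theta_true = np.array([1.0, 2.0])
x_ideal = J @ theta_true                     # ideal (noise-free) data

sigma = 0.05                                 # assumed input noise level
Sigma_X = sigma**2 * np.eye(len(t))          # additive input covariance

# Second partials of F (constant here because F is quadratic in Theta, X):
F_tt = 2.0 * J.T @ J                         # d2F / dTheta dTheta
F_tx = -2.0 * J.T                            # d2F / dTheta dX

A = -np.linalg.solve(F_tt, F_tx)             # sensitivity of Theta_hat to X
Sigma_Theta = A @ Sigma_X @ A.T              # propagated output covariance

# Verification in the spirit of the abstract's last point: compare the
# derived covariance with the experimentally observed (Monte Carlo) one.
est = np.array([
    np.linalg.lstsq(J, x_ideal + sigma * rng.standard_normal(len(t)),
                    rcond=None)[0]
    for _ in range(20000)
])
Sigma_mc = np.cov(est.T)
```

For this quadratic criterion the rule reduces to the familiar σ²(JᵀJ)⁻¹ of linear least squares, and the Monte Carlo covariance should agree with the propagated one up to sampling error; the same recipe applies whenever F is smooth and the perturbations are small, with the Hessians evaluated at the minimizer.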

Paper Details

Date Published: 23 September 1999
PDF: 18 pages
Proc. SPIE 3811, Vision Geometry VIII, (23 September 1999); doi: 10.1117/12.364086
Author Affiliation:
Robert M. Haralick, Univ. of Washington (United States)

Published in SPIE Proceedings Vol. 3811:
Vision Geometry VIII
Longin Jan Latecki; Robert A. Melter; David M. Mount; Angela Y. Wu, Editor(s)

© SPIE.