
Proceedings Paper

Mutual information and estimative consistency
Author(s): R. C. McCarty

Paper Abstract

The high-speed processing of sample data from continuous stochastic signal processes, such as broad-band, noise-like spread-spectrum signals, yields highly correlated, stochastically dependent sample information. Using dependent sample data to estimate the parameters of the parent process generally produces biased and, more importantly, inconsistent parametric estimates. For a set of continuous random variables, {X1, …, Xn; -∞ < Xi < ∞, i = 1, …, n; n ≥ 2}, Blachman, circa 1966, proposed a two-dimensional vector measure of mutual information, I0, which is easily extended to the general n-dimensional case. In 1988, the author of this paper proposed a consistent sample estimate of an extended Blachman measure of mutual information for the case of multivariate exponential-type probability distributions. A more general estimated sample measure of mutual information provides a means to determine an appropriate interval between adjacent temporal samples, since the measure does not vanish unless the samples are statistically independent. Samples spaced at such an interval can then be used to generate the usual statistically consistent process moment estimates.
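The prescription in the abstract, estimate the mutual information between temporal samples at increasing lags and pick the smallest lag at which it effectively vanishes, can be sketched in a simplified setting. This is not the paper's estimator: it assumes jointly Gaussian samples, for which mutual information reduces to I = -½ ln(1 - ρ²) and vanishes exactly when the correlation ρ is zero, and it uses an AR(1) process, a 0.01-nat threshold, and the function names below purely as illustrative assumptions.

```python
import numpy as np

def gaussian_mutual_info(x, lag):
    """Sample mutual information (nats) between x[t] and x[t+lag]
    under a Gaussian assumption: I = -0.5 * ln(1 - rho^2)."""
    a, b = x[:-lag], x[lag:]
    rho = np.corrcoef(a, b)[0, 1]
    rho = np.clip(rho, -0.999999, 0.999999)  # guard the log
    return -0.5 * np.log(1.0 - rho ** 2)

def decorrelation_lag(x, threshold=0.01, max_lag=200):
    """Smallest lag at which the Gaussian MI estimate drops below
    `threshold` nats; samples spaced this far apart are treated as
    effectively independent."""
    for k in range(1, max_lag + 1):
        if gaussian_mutual_info(x, k) < threshold:
            return k
    return max_lag

# Demo: an AR(1) process with phi = 0.9, i.e. highly correlated
# adjacent samples, as in fast-sampled broad-band noise.
rng = np.random.default_rng(0)
phi, n = 0.9, 200_000
x = np.empty(n)
x[0] = rng.standard_normal()
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.standard_normal()

k = decorrelation_lag(x)
print("decorrelation lag:", k)
print("MI at lag 1:", gaussian_mutual_info(x, 1))
print("MI at chosen lag:", gaussian_mutual_info(x, k))
```

Moment estimates computed from `x[::k]` rather than from every sample would then behave like estimates from (approximately) independent data, which is the consistency property the abstract is after.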

Paper Details

Date Published: 6 December 1989
PDF: 6 pages
Proc. SPIE 1154, Real-Time Signal Processing XII, (6 December 1989); doi: 10.1117/12.962378
R. C. McCarty, ARGO Systems Inc. (United States)

Published in SPIE Proceedings Vol. 1154:
Real-Time Signal Processing XII
J. P. Letellier, Editor(s)

© SPIE.