
Proceedings Paper

Sample Estimators For Entropic Measures Of Mutual Information
Author(s): R. C. McCarty

Paper Abstract

A nonlinear, entropic measure of mutual information (statistical dependence),

I(X1, ..., Xn) = ∫_{-∞}^{+∞} ... ∫_{-∞}^{+∞} log [ f(x1, ..., xn) / ∏_{i=1}^{n} f_i(x_i) ] dF(x1, ..., xn) ≥ 0,

was proposed in 1966 by Blachman for a set of continuous random variables (X1, ..., Xn), -∞ < X_i < +∞, i = 1, ..., n, n ≥ 2, with continuous joint distribution F(x1, ..., xn).
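As a rough illustration of what a sample estimator of such a measure looks like (this is not the estimator developed in the paper), the following sketch computes a plug-in histogram estimate of I(X1, X2) for two continuous variables, replacing the integral above with a sum over empirical bin probabilities. Function names, bin counts, and the test data are illustrative assumptions.

```python
# Plug-in histogram estimate of mutual information I(X, Y) in nats.
# Illustrative sketch only; not the paper's estimator.
import numpy as np

def mutual_information_plugin(x, y, bins=16):
    """Discrete analogue of I = sum p(x,y) * log[p(x,y) / (p(x) p(y))]."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    p_xy = joint / joint.sum()                 # empirical joint pmf
    p_x = p_xy.sum(axis=1, keepdims=True)      # marginal of X, shape (bins, 1)
    p_y = p_xy.sum(axis=0, keepdims=True)      # marginal of Y, shape (1, bins)
    mask = p_xy > 0                            # skip empty bins (log 0)
    return float(np.sum(p_xy[mask] * np.log(p_xy[mask] / (p_x @ p_y)[mask])))

rng = np.random.default_rng(0)
x = rng.normal(size=5000)
y_dep = x + 0.5 * rng.normal(size=5000)        # statistically dependent on x
y_ind = rng.normal(size=5000)                  # independent of x
print(mutual_information_plugin(x, y_dep))     # well above 0
print(mutual_information_plugin(x, y_ind))     # near 0
```

For independent variables the estimate is near zero (up to the upward bias of the plug-in estimator), while dependence pushes it positive, consistent with the inequality I ≥ 0 with equality iff the variables are independent.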

Paper Details

Date Published: 16 December 1989
PDF: 10 pages
Proc. SPIE 0977, Real-Time Signal Processing XI, (16 December 1989); doi: 10.1117/12.948566
Author Affiliations:
R. C. McCarty, ESL Incorporated (United States)

Published in SPIE Proceedings Vol. 0977:
Real-Time Signal Processing XI
J. P. Letellier, Editor

© SPIE.