
Proceedings Paper

Modeling images and textures by minimax entropy
Author(s): Song-Chun Zhu; YingNian Wu; David Mumford

Paper Abstract

This article proposes a general theory and methodology, called the minimax entropy principle, for building statistical models of images (or signals) in a variety of applications. The principle consists of two parts. (1) The maximum entropy principle, for feature binding (or fusion): given a set of observed feature statistics, a distribution is built that binds these statistics together by maximizing entropy over all distributions that reproduce them. (2) The minimum entropy principle, for feature selection: among all plausible sets of feature statistics, the set is chosen whose maximum entropy distribution has the minimum entropy. Computational and inferential issues in both parts are addressed. The minimax entropy principle is then corrected by accounting for sample variation in the observed feature statistics, and a novel information criterion is derived for feature selection. The principle is applied to texture modeling, and the relationship between the theory and mechanisms of neural computation is also discussed.
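The maximum entropy step described in the abstract can be sketched numerically: over a finite state space, fit an exponential-family model p(x) ∝ exp(Σᵢ λᵢ fᵢ(x)) by gradient ascent so that the model's expected feature statistics match the observed ones; the fitted model's entropy can then be compared across feature sets for the minimum entropy (feature selection) step. This is a minimal illustration under assumed conditions, not the paper's texture implementation; the tiny state space, the features, and the target statistics below are all hypothetical.

```python
import math

def fit_maxent(states, features, target_stats, lr=0.05, iters=20000):
    """Fit multipliers lam so that the model p(x) ∝ exp(sum_i lam[i]*f_i(x))
    reproduces the observed feature statistics (maximum entropy principle)."""
    lam = [0.0] * len(features)
    for _ in range(iters):
        # Unnormalized weights and partition function over the finite state space.
        w = [math.exp(sum(l * f(x) for l, f in zip(lam, features))) for x in states]
        Z = sum(w)
        p = [wi / Z for wi in w]
        # Model expectations of each feature under the current distribution.
        model_stats = [sum(pi * f(x) for pi, x in zip(p, states)) for f in features]
        # Gradient ascent on the log-likelihood: move each lam toward
        # matching observed and model statistics.
        for i in range(len(lam)):
            lam[i] += lr * (target_stats[i] - model_stats[i])
    return lam, p

def entropy(p):
    """Shannon entropy in nats; used for the feature-selection comparison."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

# Hypothetical example: 3-bit "signals", mean and spread as feature statistics.
states = list(range(8))
features = [lambda x: x, lambda x: (x - 3.5) ** 2]
target = [3.5, 2.0]  # assumed observed statistics, for illustration only

lam, p = fit_maxent(states, features, target)
```

Comparing `entropy(p)` across candidate feature sets realizes the minimum entropy step: the set whose fitted maximum entropy model has the lowest entropy is the most informative.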

Paper Details

Date Published: 3 June 1997
PDF: 14 pages
Proc. SPIE 3016, Human Vision and Electronic Imaging II, (3 June 1997); doi: 10.1117/12.274536
Author Affiliations:
Song-Chun Zhu, Brown Univ. (United States)
YingNian Wu, Univ. of Michigan (United States)
David Mumford, Brown Univ. (United States)


Published in SPIE Proceedings Vol. 3016:
Human Vision and Electronic Imaging II
Bernice E. Rogowitz; Thrasyvoulos N. Pappas, Editor(s)

© SPIE.