
Proceedings Paper

Some relationships between minimum Bayes error and information theoretical feature extraction
Author(s): Manuela Vasconcelos; Nuno Vasconcelos

Paper Abstract

Feature extraction and selection are important problems in statistical learning. We study the relationships between two previously proposed principles for their optimal solution: the minimization of Bayes error and the maximization of mutual information between features and class labels. It is shown that a quantity which provides insight on this relationship is the set of non-increasing probability mass functions (NIPMFs). We derive some basic properties of the members of this set, show that any classification problem defines an ensemble of NIPMFs, and that the probability distribution of this ensemble uniquely determines the associated Bayes error and mutual information. These results are then used to show that, when the classification problem is binary and some generic constraints hold, the optimal feature space is the same under the two formulations.
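The two quantities the abstract relates can be illustrated on a small discrete problem. The sketch below (not from the paper; the joint table is invented for illustration) computes the Bayes error and the mutual information I(X; Y) for a discrete feature X and a binary class label Y directly from their joint probability mass function:

```python
import numpy as np

# Hypothetical joint distribution p(x, y) over a discrete feature X
# (rows) and a binary class label Y (columns). Any joint table that
# sums to 1 works; these numbers are made up for illustration.
p_xy = np.array([
    [0.30, 0.05],
    [0.10, 0.15],
    [0.05, 0.35],
])

p_x = p_xy.sum(axis=1)  # marginal p(x)
p_y = p_xy.sum(axis=0)  # marginal p(y)

# Bayes error: for each value of x, the optimal classifier picks the
# more probable class, so the residual error mass is min_y p(x, y).
bayes_error = p_xy.min(axis=1).sum()

# Mutual information I(X; Y) in nats:
# sum over (x, y) of p(x, y) * log( p(x, y) / (p(x) p(y)) ).
mi = np.sum(p_xy * np.log(p_xy / np.outer(p_x, p_y)))

print(f"Bayes error: {bayes_error:.3f}")
print(f"I(X; Y):     {mi:.3f} nats")
```

For a fixed feature space, both quantities are determined by the joint distribution, which is the sense in which the abstract's ensemble of non-increasing probability mass functions can pin down both at once; minimizing Bayes error and maximizing I(X; Y) then pull on the same underlying object.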

Paper Details

Date Published: 19 May 2005
PDF: 12 pages
Proc. SPIE 5807, Automatic Target Recognition XV, (19 May 2005); doi: 10.1117/12.604289
Author Affiliations
Manuela Vasconcelos, Harvard Univ. (United States)
Nuno Vasconcelos, Univ. of California/San Diego (United States)

Published in SPIE Proceedings Vol. 5807:
Automatic Target Recognition XV
Firooz A. Sadjadi, Editor(s)

© SPIE.