
Proceedings Paper

Probabilistic inequalities with applications to machine learning
Author(s): Xinjia Chen

Paper Abstract

We propose a new approach for deriving probabilistic inequalities based on bounding likelihood ratios. We demonstrate that this approach is more general and more powerful than the classical moment-generating-function method frequently used to derive concentration inequalities such as Chernoff bounds. We discover that the proposed approach is inherently related to statistical concepts such as the monotone likelihood ratio, maximum likelihood, and the method of moments for parameter estimation. A connection between the proposed approach and large deviation theory is also established. We show that, without using moment generating functions, the tightest possible concentration inequalities can be readily derived by the proposed approach. Applications of the new probabilistic techniques to statistical machine learning theory are demonstrated.
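
To make the abstract's contrast concrete, here is a minimal sketch of the general likelihood-ratio bounding idea and of how the Chernoff bound arises as a special case via exponential tilting. This is a standard folklore formulation given for orientation only; the paper's own derivation and notation may differ.

```latex
% Sketch: concentration bounds from a likelihood-ratio lower bound.
% Let P have density f, and let Q be any reference distribution with
% density g. Write the likelihood ratio as \Lambda(x) = g(x)/f(x).
\[
  \text{If } \Lambda(x) \ge \lambda > 0 \ \text{ for all } x \in A,
  \qquad\text{then}\qquad
  P(A) \;\le\; \frac{Q(A)}{\lambda} \;\le\; \frac{1}{\lambda},
\]
% since Q(A) = \int_A \Lambda \, dP \ge \lambda \, P(A).
%
% The Chernoff bound is recovered by choosing Q to be the exponentially
% tilted distribution with density g_t(x) = e^{t x} f(x) / M(t), where
% M(t) = E_P[e^{tX}]. On A = \{X \ge a\} with t > 0 one has
% \Lambda(x) = e^{t x}/M(t) \ge e^{t a}/M(t), so the bound above gives
\[
  P(X \ge a) \;\le\; e^{-t a}\, M(t)
            \;=\; e^{-t a}\, \mathbb{E}_P\!\left[e^{tX}\right].
\]
```

The first inequality requires no moment generating function; any reference distribution Q whose likelihood ratio can be bounded from below on the event of interest yields a bound, which is the sense in which the likelihood-ratio approach generalizes the Chernoff/MGF technique.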

Paper Details

Date Published: 22 May 2014
PDF: 12 pages
Proc. SPIE 9118, Independent Component Analyses, Compressive Sampling, Wavelets, Neural Net, Biosystems, and Nanoengineering XII, 91180R (22 May 2014); doi: 10.1117/12.2049982
Author Affiliations:
Xinjia Chen, Southern Univ. and A&M College (United States)


Published in SPIE Proceedings Vol. 9118:
Independent Component Analyses, Compressive Sampling, Wavelets, Neural Net, Biosystems, and Nanoengineering XII
Harold H. Szu; Liyi Dai, Editor(s)
