
Proceedings Paper

Boosting and support vector machines as optimal separators
Author(s): Saharon Rosset; Ji Zhu; Trevor J. Hastie

Paper Abstract

In this paper we study boosting methods from a new perspective. We build on recent work by Efron et al. to show that boosting approximately (and in some cases exactly) minimizes its loss criterion with an L1 constraint. For the two most commonly used loss criteria (exponential and logistic log-likelihood), we further show that as the constraint diminishes, or equivalently as the boosting iterations proceed, the solution converges in the separable case to an “L1-optimal” separating hyper-plane. This “L1-optimal” separating hyper-plane has the property of maximizing the minimal margin of the training data, as defined in the boosting literature. We illustrate through examples the regularized and asymptotic behavior of the solutions to the classification problem with both loss criteria.
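
The convergence the abstract describes can be watched numerically. The sketch below is not the paper's own code; it is a minimal illustration, assuming a hypothetical synthetic separable dataset, the input coordinates themselves as base learners, exponential loss, and an arbitrary step size eps. It runs slow forward-stagewise ("epsilon-boosting") coordinate descent and prints the normalized minimal margin min_i y_i f(x_i) / ||alpha||_1, which should grow toward the L1-optimal margin as iterations proceed.

import numpy as np

# Hypothetical separable toy data: two Gaussian blobs in 2-D,
# plus an intercept column so the hyperplane need not pass through the origin.
rng = np.random.default_rng(0)
n = 40
X_pos = rng.normal(loc=+2.0, scale=0.5, size=(n, 2))
X_neg = rng.normal(loc=-2.0, scale=0.5, size=(n, 2))
X = np.vstack([X_pos, X_neg])
X = np.hstack([X, np.ones((2 * n, 1))])  # intercept as an extra "base learner"
y = np.concatenate([np.ones(n), -np.ones(n)])

eps = 0.01      # small step size (assumed value, "epsilon-boosting")
n_iter = 20000
alpha = np.zeros(X.shape[1])

for t in range(n_iter):
    margins = y * (X @ alpha)
    w = np.exp(-margins)                # exponential-loss weights on the examples
    grad = -(w * y) @ X                 # gradient of the loss in each coordinate
    j = np.argmax(np.abs(grad))         # steepest coordinate = selected base learner
    alpha[j] -= eps * np.sign(grad[j])  # tiny step, as in forward stagewise fitting

    if t % 5000 == 0:
        l1 = np.abs(alpha).sum()
        min_margin = (y * (X @ alpha)).min() / l1 if l1 > 0 else 0.0
        print(f"iter {t:6d}  ||alpha||_1 = {l1:8.3f}  min L1-margin = {min_margin:.4f}")

On separable data such as these blobs, the printed minimal margin increases monotonically in the limit of small eps, which is the behavior the abstract attributes to the boosting iterations.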

Paper Details

Date Published: 13 January 2003
PDF: 7 pages
Proc. SPIE 5010, Document Recognition and Retrieval X, (13 January 2003); doi: 10.1117/12.497492
Author Affiliations:
Saharon Rosset, Stanford Univ. (United States)
Ji Zhu, Stanford Univ. (United States)
Trevor J. Hastie, Stanford Univ. (United States)


Published in SPIE Proceedings Vol. 5010:
Document Recognition and Retrieval X
Tapas Kanungo; Elisa H. Barney Smith; Jianying Hu; Paul B. Kantor, Editor(s)

© SPIE.