
Proceedings Paper

Boosting and support vector machines as optimal separators
Author(s): Saharon Rosset; Ji Zhu; Trevor J. Hastie

Paper Abstract

In this paper we study boosting methods from a new perspective. We build on recent work by Efron et al. to show that boosting approximately (and in some cases exactly) minimizes its loss criterion with an L1 constraint. For the two most commonly used loss criteria (exponential and logistic log-likelihood), we further show that as the constraint diminishes, or equivalently as the boosting iterations proceed, the solution converges in the separable case to an "L1-optimal" separating hyper-plane. This "L1-optimal" separating hyper-plane has the property of maximizing the minimal margin of the training data, as defined in the boosting literature. We illustrate through examples the regularized and asymptotic behavior of the solutions to the classification problem with both loss criteria.
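The abstract's central claim, that slow boosting traces an approximately L1-regularized path whose normalized minimal margin grows in the separable case, can be illustrated with a small sketch. This is not the authors' code: the toy data, ε-boosting variant (coordinate descent with a tiny fixed step on exponential loss), step size, and iteration count are all illustrative assumptions.

```python
import numpy as np

# Illustrative sketch (not from the paper): epsilon-boosting on exponential
# loss. Each iteration takes a tiny step on one coordinate ("weak learner"),
# so the L1 norm of the coefficients grows slowly; on separable toy data the
# normalized minimal margin  min_i y_i f(x_i) / ||beta||_1  should trend
# upward, consistent with convergence toward an L1-optimal separator.
rng = np.random.default_rng(0)
n, p = 40, 5
X = rng.normal(size=(n, p))
beta_true = np.array([2.0, -1.0, 0.0, 0.0, 0.5])   # assumed toy separator
y = np.sign(X @ beta_true)                          # separable labels in {-1, +1}

eps = 0.01                 # small step size: the "slow learning" that mimics L1
beta = np.zeros(p)
margins = []
for t in range(5000):
    # gradient of sum_i exp(-y_i * x_i . beta) with respect to beta
    grad = -(y * np.exp(-y * (X @ beta))) @ X
    j = np.argmax(np.abs(grad))        # coordinate with steepest descent
    beta[j] -= eps * np.sign(grad[j])  # epsilon step; ||beta||_1 grows slowly
    if t % 1000 == 999:
        margins.append((y * (X @ beta)).min() / np.abs(beta).sum())

print(margins)  # normalized minimal margins, sampled every 1000 iterations
```

With this seed the sampled normalized margin increases across the run, matching the qualitative behavior the abstract describes for the separable case; the exact values depend entirely on the assumed toy data and step size.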

Paper Details

Date Published: 13 January 2003
PDF: 7 pages
Proc. SPIE 5010, Document Recognition and Retrieval X, (13 January 2003); doi: 10.1117/12.497492
Author Affiliations:
Saharon Rosset, Stanford Univ. (United States)
Ji Zhu, Stanford Univ. (United States)
Trevor J. Hastie, Stanford Univ. (United States)

Published in SPIE Proceedings Vol. 5010:
Document Recognition and Retrieval X
Tapas Kanungo; Elisa H. Barney Smith; Jianying Hu; Paul B. Kantor, Editor(s)

© SPIE.