
Proceedings Paper

Unsupervised optimization of support vector machine parameters
Author(s): Mary L. Cassabaum; Donald E. Waagen; Jeffrey J. Rodriguez; Harry A. Schmitt

Paper Abstract

Selection of the kernel parameters is critical to the performance of a Support Vector Machine (SVM), directly impacting its generalization and classification efficacy. Given the intractability of exhaustive search, an automated procedure for parameter selection is clearly desirable. The authors' previous work in this area analyzed the SVM training-data margin distributions for a Gaussian kernel to guide the kernel parameter selection process; that approach required several iterations of SVM training to minimize the number of support vectors. Our continued investigation of unsupervised kernel parameter selection has led to a scheme that selects the parameters before training occurs: statistical measures computed on the Gram matrix are used to choose the kernel parameter in an unsupervised fashion. This preprocessing framework removes the requirement for iterative SVM training. Empirical results are presented for the "toy" checkerboard and quadboard problems.
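The abstract does not specify which Gram-matrix statistics the authors use, so the following is only an illustrative sketch of the general idea: compute the Gaussian-kernel Gram matrix for several candidate bandwidths and score each, before any SVM training, by a simple spread statistic (here, the variance of the off-diagonal entries — a hypothetical choice, not the paper's criterion). Degenerate bandwidths, where the Gram matrix collapses toward all-zeros or all-ones, score low.

```python
import numpy as np

def gaussian_gram(X, sigma):
    """Gram matrix K[i, j] = exp(-||x_i - x_j||^2 / (2 * sigma^2))."""
    sq = np.sum(X**2, axis=1)
    # Squared pairwise distances via the expansion ||a-b||^2 = ||a||^2 + ||b||^2 - 2 a.b
    d2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
    return np.exp(-np.maximum(d2, 0.0) / (2.0 * sigma**2))

def select_sigma(X, candidates):
    """Pick the candidate sigma whose Gram matrix has the largest variance
    among its off-diagonal entries (an assumed, illustrative statistic).
    Very small sigma drives all off-diagonal entries toward 0 and very
    large sigma drives them toward 1; both extremes give near-zero
    variance, so this criterion favors an intermediate bandwidth."""
    best_sigma, best_score = None, -np.inf
    mask = ~np.eye(len(X), dtype=bool)      # select off-diagonal entries
    for sigma in candidates:
        score = gaussian_gram(X, sigma)[mask].var()
        if score > best_score:
            best_sigma, best_score = sigma, score
    return best_sigma
```

No SVM is trained during this selection; as in the paper's framework, the kernel parameter is fixed from the data alone, and only a single SVM training run with the chosen parameter would follow.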

Paper Details

Date Published: 21 September 2004
PDF: 10 pages
Proc. SPIE 5426, Automatic Target Recognition XIV, (21 September 2004); doi: 10.1117/12.542422
Author Affiliations:
Mary L. Cassabaum, Raytheon Missile Systems Co. (United States)
Donald E. Waagen, Raytheon Missile Systems Co. (United States)
Jeffrey J. Rodriguez, Univ. of Arizona (United States)
Harry A. Schmitt, Raytheon Missile Systems Co. (United States)

Published in SPIE Proceedings Vol. 5426:
Automatic Target Recognition XIV
Firooz A. Sadjadi, Editor(s)

© SPIE.