
Proceedings Paper

Parameter optimization of LS-SVM for regression using NGA
Author(s): Qi Wang; Zhigang Feng; Katsunori Shida

Paper Abstract

Compared with the support vector machine (SVM), the least squares support vector machine (LS-SVM) overcomes the shortcoming of a high computational burden by solving a set of linear equations, and it has been widely used in classification and nonlinear function estimation. However, there is no efficient method for selecting the LS-SVM parameters. In this paper, a sharing-function-based niche genetic algorithm (SNGA) is applied to the parameter optimization of LS-SVM for regression. In the SNGA approach, k-fold cross validation is used to evaluate the generalization performance of the LS-SVM, and the inverse of the average test error over the k trials serves as the fitness value. The Hamming distance between each pair of individuals defines the sharing function. Two benchmark problems, SINC function regression and Henon map time series prediction, are used as demonstration examples. The results indicate that this approach avoids the blindness of choosing the LS-SVM parameters by hand and improves both the efficiency and the regression capability. With little modification, the approach can also be applied to the parameter optimization of SVM or LS-SVM for classification.
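
The paper itself does not list an implementation on this page; the following is a minimal sketch, under assumptions, of the fitness evaluation described in the abstract: an RBF-kernel LS-SVM regressor trained by solving its linear system, with k-fold cross validation whose inverse average test error is returned as the GA fitness. The parameter names gamma (regularization) and sigma (kernel width) and the use of mean squared error are illustrative choices, not taken from the paper, and the niche GA itself (binary-encoded individuals with Hamming-distance sharing) is omitted.

```python
import numpy as np

def rbf_kernel(A, B, sigma):
    """RBF (Gaussian) kernel matrix between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_fit(X, y, gamma, sigma):
    """Train an LS-SVM regressor by solving its linear KKT system."""
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma   # regularization enters on the diagonal
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[1:], sol[0]              # alpha, bias b

def lssvm_predict(X_train, alpha, b, sigma, X_test):
    return rbf_kernel(X_test, X_train, sigma) @ alpha + b

def fitness(X, y, gamma, sigma, k=5):
    """Inverse of the average k-fold test MSE, used as the GA fitness value."""
    idx = np.arange(len(y))
    errs = []
    for fold in np.array_split(idx, k):
        tr = np.setdiff1d(idx, fold)
        alpha, b = lssvm_fit(X[tr], y[tr], gamma, sigma)
        pred = lssvm_predict(X[tr], alpha, b, sigma, X[fold])
        errs.append(np.mean((pred - y[fold]) ** 2))
    return 1.0 / (np.mean(errs) + 1e-12)

# Toy usage on a SINC-style regression problem, as in the first benchmark.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(-5, 5, size=(100, 1))
    y = np.sinc(X[:, 0]) + 0.05 * rng.standard_normal(100)
    print(fitness(X, y, gamma=10.0, sigma=1.0))
```

In an SNGA-style search, each candidate (gamma, sigma) pair would be scored by this fitness function, with the sharing function penalizing individuals that are too close (small Hamming distance) so that the population maintains multiple niches.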

Paper Details

Date Published: 30 April 2007
PDF: 8 pages
Proc. SPIE 6560, Intelligent Computing: Theory and Applications V, 65600A (30 April 2007); doi: 10.1117/12.718893
Author Affiliations:
Qi Wang, Harbin Institute of Technology (China)
Zhigang Feng, Harbin Institute of Technology (China)
Katsunori Shida, Harbin Institute of Technology (China)


Published in SPIE Proceedings Vol. 6560:
Intelligent Computing: Theory and Applications V
Kevin L. Priddy; Emre Ertin, Editor(s)
