ID | 19614
Eprint ID | 19614
FullText URL |
Author | Lee Wan-Jui; Yang Chih-Cheng; Lee Shie-Jue
Abstract | In this paper, we propose a method for selecting support vectors to improve the performance of support vector regression machines. First, the orthogonal least-squares method is adopted to evaluate the support vectors based on their error reduction ratios. By selecting representative support vectors, we obtain a simpler model that helps avoid the over-fitting problem. Second, the simplified model is further refined by applying the gradient descent method to tune the parameters of the kernel functions. Learning rules for minimizing the regularized risk functional are derived. Experimental results show that our approach can effectively improve the generalization capability of support vector regressors.
Keywords | Orthogonal least-squares; over-fitting; gradient descent; learning rules; error reduction ratio; mean square error
Published Date | 2009-11-10
Publication Title | Proceedings: Fifth International Workshop on Computational Intelligence & Applications
Volume | 2009
Issue | 1
Publisher | IEEE SMC Hiroshima Chapter
Start Page | 18
End Page | 23
ISSN | 1883-3977
NCID | BB00577064
Content Type | Conference Paper
Language | English
Copyright Holders | IEEE SMC Hiroshima Chapter
Event Title | 5th International Workshop on Computational Intelligence & Applications IEEE SMC Hiroshima Chapter : IWCIA 2009
Event Location | 東広島市
Event Location Alternative | Higashi-Hiroshima City
File Version | publisher
Refereed | True
Eprints Journal Name | IWCIA
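The first stage described in the abstract, selecting support vectors by orthogonal least-squares error reduction ratios, can be sketched as a standard OLS forward-selection loop. The function below is an illustrative reconstruction under stated assumptions, not the authors' code: the names `ols_select`, `Phi` (a matrix whose columns are kernel evaluations for candidate support vectors), and the stopping threshold `tol` are all hypothetical.

```python
import numpy as np

def ols_select(Phi, y, tol=0.95):
    """Greedy OLS forward selection of candidate support vectors.

    Phi : (n_samples, n_candidates) matrix, one kernel column per candidate.
    y   : (n_samples,) regression targets.
    Returns the indices of selected columns, in selection order, stopping
    once the cumulative error reduction ratio reaches `tol`.
    """
    n, m = Phi.shape
    yy = float(y @ y)          # total target energy, normalizes each ratio
    selected = []
    W = []                     # orthogonalized regressors kept so far
    total_err = 0.0
    for _ in range(m):
        best_i, best_err, best_w = None, 0.0, None
        for i in range(m):
            if i in selected:
                continue
            w = Phi[:, i].astype(float).copy()
            # Gram-Schmidt: remove components along already-selected regressors
            for wk in W:
                w -= (wk @ Phi[:, i]) / (wk @ wk) * wk
            ww = float(w @ w)
            if ww < 1e-12:     # candidate is (numerically) in the selected span
                continue
            # error reduction ratio: fraction of y's energy this column explains
            err = (float(w @ y) ** 2) / (ww * yy)
            if err > best_err:
                best_i, best_err, best_w = i, err, w
        if best_i is None:
            break
        selected.append(best_i)
        W.append(best_w)
        total_err += best_err
        if total_err >= tol:   # representative subset found; stop early
            break
    return selected
```

Because each ratio measures the target energy explained by a candidate after orthogonalization against the columns already chosen, the loop keeps only representative vectors and yields the simpler model the abstract refers to; the paper's second stage would then tune the kernel parameters of this reduced model by gradient descent.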