Holger Fröhlich and Andreas Zell
Efficient Parameter Selection for Support Vector Machines in Classification and Regression via Model-Based Global Optimization
Abstract
Support Vector Machines (SVMs) have become one of the most popular
methods in Machine Learning in recent years. A particular strength
is the use of a kernel function to introduce nonlinearity and to deal
with arbitrarily structured data. The kernel function usually depends
on certain parameters, which, together with other parameters of the
SVM, have to be tuned to achieve good results. However, finding good
parameters can become a real computational burden as the number of
parameters and the size of the dataset increase. In this paper we
propose an algorithm for the model selection problem that is based
on the idea of learning an Online Gaussian Process model of the error
surface in parameter space and sampling systematically at points for
which the so-called expected improvement is highest. Our experiments
show that in this way we can find good parameters very efficiently.
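To illustrate the general idea described in the abstract, the sketch below models the cross-validation error surface of an RBF-SVM over its parameters (C and gamma) with a Gaussian Process and repeatedly evaluates the candidate with the highest expected improvement. This is only a minimal illustration of the underlying principle, not the authors' method: it uses scikit-learn's standard GP regressor rather than an Online Gaussian Process, and the synthetic dataset, candidate grid, and sampling budget are all illustrative assumptions.

```python
# Minimal sketch: expected-improvement-based SVM parameter search.
# Illustrative only; not the paper's Online Gaussian Process implementation.
import numpy as np
from scipy.stats import norm
from sklearn.datasets import make_classification
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, n_features=10, random_state=0)

def cv_error(log2_C, log2_gamma):
    """5-fold cross-validation error of an RBF-SVM at (log2 C, log2 gamma)."""
    clf = SVC(C=2.0 ** log2_C, gamma=2.0 ** log2_gamma)
    return 1.0 - cross_val_score(clf, X, y, cv=5).mean()

# Candidate points in log2 parameter space (an assumed search range).
grid = np.array([[c, g] for c in np.linspace(-5, 15, 21)
                        for g in np.linspace(-15, 3, 19)])

# Small random initial design.
rng = np.random.default_rng(0)
init = rng.choice(len(grid), size=5, replace=False)
params = grid[init].tolist()
errors = [cv_error(*p) for p in params]

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)

for _ in range(20):                          # sampling budget (assumed)
    gp.fit(np.array(params), np.array(errors))
    mu, sigma = gp.predict(grid, return_std=True)
    best = min(errors)
    imp = best - mu                          # improvement: we minimize the error
    z = np.where(sigma > 0, imp / sigma, 0.0)
    ei = imp * norm.cdf(z) + sigma * norm.pdf(z)   # expected improvement
    ei[sigma == 0] = 0.0                     # no gain at already-known points
    nxt = grid[int(np.argmax(ei))]           # sample where EI is highest
    params.append(nxt.tolist())
    errors.append(cv_error(*nxt))

best_i = int(np.argmin(errors))
print("best log2(C), log2(gamma):", params[best_i], "CV error:", errors[best_i])
```

The key design choice is that the GP surrogate is cheap to query, so the expensive cross-validation is only run at points the expected-improvement criterion deems promising, trading a few surrogate fits for far fewer SVM trainings than a full grid search.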
Download
[pdf]
BibTeX
@conference{Froe05ParamSelSVM,
AUTHOR="H. Fröhlich and A. Zell",
TITLE="{Efficient Parameter Selection for Support Vector Machines in Classification and Regression via Model-Based Global Optimization}",
BOOKTITLE="Proc. Int. Joint Conf. on Neural Networks",
PAGES="1431--1438",
YEAR=2005
}