One of the problems with using complex kernels in support vector machines is that they tend to produce odd classification boundaries, like the ones below. (I generated them with a Java SVM applet from here, whose reliability I cannot swear to but have no reason to doubt.) Both boundaries come from SVMs with Gaussian RBF kernels: the first with $latex \sigma = 1$ and the second with $latex \sigma = 10$, on two different data sets. Note the segments of the boundary to the east of the blue examples in the bottom figure, and those to the south and to the north-east of the blue examples in the top figure; they seem to violate intuition.

The reason for these anomalous boundaries is, of course, the high complexity of the function class induced by the RBF kernel with large $latex \sigma$, which gives the classifier a propensity to make subtle distinctions even in regions of fairly low example density.

A possible solution: using complex kernels only where they are needed

We propose to b...
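To see why a larger $latex \sigma$ can mean a more flexible function class, here is a minimal sketch of the kernel itself. It assumes the applet uses the gamma-style parameterization $latex k(x, y) = \exp(-\sigma \|x - y\|^2)$ (an assumption on my part, since the applet's conventions are not documented here); under it, a larger $latex \sigma$ makes the kernel narrower, so each support vector influences only a small neighborhood and the boundary can wiggle between nearby examples.

```python
import math

def rbf(x, y, sigma):
    """RBF kernel exp(-sigma * ||x - y||^2).

    Assumes the gamma-style parameterization, where larger sigma
    means a narrower, more local kernel (an assumption about the
    applet, not a fact from its documentation).
    """
    d2 = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-sigma * d2)

# Kernel response at increasing distances for the two sigma values
# from the figures: the sigma = 10 kernel is essentially zero beyond
# a short range, so the classifier is free to carve out small regions.
for sigma in (1, 10):
    vals = [rbf((0.0,), (d,), sigma) for d in (0.0, 0.5, 1.0, 2.0)]
    print(sigma, [round(v, 4) for v in vals])
```

With $latex \sigma = 1$ the kernel still assigns weight 0.37 to a point at distance 1, while with $latex \sigma = 10$ that weight has collapsed to about $latex 5 \times 10^{-5}$, which is the locality that lets the boundary make fine distinctions even where examples are sparse.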