An approximation of the Gaussian RBF kernel for efficient classification with SVMs

Journal article


Publication Details

Author(s): Ring M, Eskofier B
Journal: Pattern Recognition Letters
Publisher: Elsevier
Publication year: 2016
Volume: 84
Pages range: 107–113
ISSN: 0167-8655


Abstract


In theory, kernel support vector machines (SVMs) can be reformulated as linear SVMs. This reformulation can speed up SVM classification considerably, in particular when the number of support vectors is high. For the widely used Gaussian radial basis function (RBF) kernel, however, this reformulation is impractical because the reproducing kernel Hilbert space (RKHS) of this kernel has infinite dimensionality. Therefore, we derive a finite-dimensional approximative feature map, based on an orthonormal basis of the kernel's RKHS, to enable the reformulation of Gaussian RBF SVMs as linear SVMs. We show that the error of this approximative feature map decreases at a factorial rate as the approximation order is increased linearly. Experimental evaluations demonstrate that the approximative feature map achieves considerable speed-ups (about 18-fold on average), mostly without loss of classification accuracy. The proposed approximative feature map therefore provides an efficient SVM evaluation method with minimal loss of precision.
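The paper's exact basis construction is not reproduced here; the following Python sketch only illustrates the general idea for one-dimensional inputs, using a truncated Taylor-style expansion of the Gaussian RBF kernel as an explicit feature map. The function name rbf_feature_map, the parameters gamma and degree, and the restriction to 1-D inputs are illustrative assumptions, not the paper's notation (multivariate inputs would require multinomial terms).

import numpy as np
from math import factorial

def rbf_feature_map(x, gamma, degree):
    # Truncated explicit feature map for the 1-D Gaussian RBF kernel
    # k(x, y) = exp(-gamma * (x - y)^2).  The n-th component is
    # phi_n(x) = exp(-gamma * x^2) * sqrt((2*gamma)^n / n!) * x^n,
    # so that phi(x) . phi(y) -> k(x, y) as degree -> infinity.
    x = np.asarray(x, dtype=float).reshape(-1, 1)            # shape (N, 1)
    n = np.arange(degree + 1)                                 # orders 0 .. degree
    coeffs = np.sqrt((2.0 * gamma) ** n /
                     np.array([factorial(k) for k in n], dtype=float))
    return np.exp(-gamma * x ** 2) * coeffs * x ** n          # shape (N, degree + 1)

# Compare the truncated feature map against the exact kernel value.
gamma, degree = 0.5, 10
x, y = 1.2, -0.7
phi_x = rbf_feature_map([x], gamma, degree)
phi_y = rbf_feature_map([y], gamma, degree)
print((phi_x @ phi_y.T).item())        # approximate kernel value
print(np.exp(-gamma * (x - y) ** 2))   # exact kernel value, nearly identical

In this simplified expansion, the 1/sqrt(n!) factor in each component is what drives the factorial error decay referred to in the abstract. Once training and test data are mapped with such a finite-dimensional feature map, an ordinary linear SVM can be trained and evaluated on the transformed features, which is the speed-up mechanism the abstract describes.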



FAU Authors / FAU Editors

Eskofier, Björn Prof. Dr.
Stiftungs-Juniorprofessur für Sportinformatik (Digital Sports)
Ring, Matthias
Lehrstuhl für Informatik 5 (Mustererkennung)


How to cite

APA:
Ring, M., & Eskofier, B. (2016). An approximation of the Gaussian RBF kernel for efficient classification with SVMs. Pattern Recognition Letters, 84, 107–113. https://dx.doi.org/10.1016/j.patrec.2016.08.013

MLA:
Ring, Matthias, and Björn Eskofier. "An approximation of the Gaussian RBF kernel for efficient classification with SVMs." Pattern Recognition Letters 84 (2016): 107–113.

BibTeX:
@article{Ring2016RBFApproximation,
  author  = {Ring, Matthias and Eskofier, Bj{\"o}rn},
  title   = {An approximation of the {Gaussian} {RBF} kernel for efficient classification with {SVMs}},
  journal = {Pattern Recognition Letters},
  volume  = {84},
  pages   = {107--113},
  year    = {2016},
  doi     = {10.1016/j.patrec.2016.08.013}
}
