Kernel Error Path Algorithm

Document Type

Article

Publication Title

IEEE Transactions on Neural Networks and Learning Systems

Abstract

Tuning the values of kernel parameters plays a vital role in the performance of kernel methods. Kernel path algorithms have been proposed for several important learning algorithms, including the support vector machine and kernelized Lasso; they fit the piecewise nonlinear solutions of kernel methods with respect to the kernel parameter over a continuous range. Although error path algorithms have been proposed to ensure that the model with the minimum cross-validation (CV) error, which is usually the ultimate goal of model selection, can be found, they are limited to piecewise linear solution paths. To address this problem, in this article, we extend the classic error path algorithm to nonlinear kernel solution paths and propose a new kernel error path algorithm (KEP) that can find the globally optimal kernel parameter with the minimum CV error. Specifically, we first prove that the error functions of binary classification and regression problems are piecewise constant or smooth w.r.t. the kernel parameter. Then, we propose KEP for the support vector machine and kernelized Lasso and prove that it is guaranteed to find the model with the minimum CV error over the whole range of kernel parameter values. Experimental results on various datasets show that KEP finds the model with the minimum CV error in less time and achieves better generalization error on the test set than grid search and random search.
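
For orientation, the sketch below illustrates the grid-search baseline that KEP is compared against: it scans the RBF kernel parameter of an SVM over a finite grid and records the CV error. Because the CV error of a binary classifier is piecewise constant in the kernel parameter, a finite grid can miss the interval containing the global minimum, which is the gap a continuous path method aims to close. This is an illustrative sketch only, assuming scikit-learn's SVC and cross_val_score; it is not the authors' KEP implementation.

    # Illustrative grid-search baseline (not the authors' KEP algorithm).
    # Assumes numpy and scikit-learn are available.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.model_selection import cross_val_score
    from sklearn.svm import SVC

    X, y = make_classification(n_samples=200, n_features=10, random_state=0)

    # Scan the RBF kernel parameter gamma over a finite grid and record the
    # 5-fold CV error at each grid point.
    gammas = np.logspace(-3, 2, 50)
    cv_errors = [1.0 - cross_val_score(SVC(kernel="rbf", gamma=g), X, y, cv=5).mean()
                 for g in gammas]

    # Report the best grid point; a continuous path method would instead
    # certify the minimum over the whole parameter range.
    best = int(np.argmin(cv_errors))
    print(f"grid-search minimum CV error {cv_errors[best]:.3f} at gamma={gammas[best]:.4g}")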

DOI

10.1109/TNNLS.2022.3153953

Publication Date

3-11-2022

Keywords

Approximation algorithms, Computational modeling, Cross validation (CV), error path, Kernel, kernel path (KP), Machine learning algorithms, model selection, Support vector machines, Training, Tuning

Comments

IR deposit conditions:

  • OA (Accepted version) - pathway a
  • No embargo
  • When accepted for publication, set statement to accompany deposit (see policy)
  • Must link to publisher version with DOI
  • Publisher copyright and source must be acknowledged
