
A new algorithm for support vector regression with automatic selection of hyperparameters

Wang, You-Gan
Wu, Jinran
Hu, Zhi-Hua
McLachlan, Geoffrey J.
Abstract
The hyperparameters in support vector regression (SVR) determine how effectively the support vectors fit the data and predict. However, choosing these hyperparameters has always been challenging in both theory and practice. The ν-support vector regression elegantly eliminates the need to specify an ϵ value, but at the cost of specifying or postulating a ν value. We propose an extended primal objective function, arising from probability regularization, that leads to an automatic selection of ϵ and allows ν to be expressed as an explicit function of ϵ. The resulting hyperparameter values can be interpreted as ‘working’ values required only in training, not in testing or prediction. This regularized algorithm, termed ϵ∗-SVR, automatically provides a data-dependent ϵ and is closely connected to ν-support vector regression in the sense that ν, as a fraction, is a sensible function of ϵ. The ϵ∗-SVR thus selects both the ν and ϵ values automatically. We illustrate these findings on several public benchmark datasets.
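For context on the hyperparameters the abstract discusses, here is a minimal sketch (not the paper's ϵ∗-SVR algorithm, whose objective function is not given here) contrasting standard ϵ-SVR, where the tube width ϵ must be fixed up front, with ν-SVR, where ν is specified instead and ϵ is determined implicitly, using scikit-learn; the toy data and parameter values are illustrative assumptions.

```python
# Contrast of standard epsilon-SVR and nu-SVR hyperparameters.
# NOT the paper's epsilon*-SVR; toy data for illustration only.
import numpy as np
from sklearn.svm import SVR, NuSVR

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X).ravel() + rng.normal(scale=0.1, size=200)

# epsilon-SVR: the insensitivity tube width epsilon is chosen by the user.
eps_svr = SVR(kernel="rbf", C=1.0, epsilon=0.1).fit(X, y)

# nu-SVR: nu is chosen instead; epsilon is found as part of the optimization.
# nu upper-bounds the fraction of margin errors and lower-bounds the
# fraction of support vectors.
nu_svr = NuSVR(kernel="rbf", C=1.0, nu=0.5).fit(X, y)

frac_sv = len(nu_svr.support_) / len(X)  # fraction of training points that are SVs
```

Either way, a value (ϵ or ν) must be supplied in advance; the abstract's ϵ∗-SVR removes that requirement by selecting a data-dependent ϵ through probability regularization.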
Keywords
automatic selection, loss functions, noise models, parameter estimation, probability regularization
Date
2023
Type
Journal article
Journal
Pattern Recognition
Volume
133
Page Range
1-9
Article Number
Article 108989
ACU Department
Institute for Learning Sciences and Teacher Education (ILSTE)
Faculty of Education and Arts
Institute for Positive Psychology and Education
License
All rights reserved
File Access
Controlled