Efficient Regularized Least Squares Classification

Abstract

Kernel-based regularized least squares (RLS) algorithms are a promising technique for classification. RLS minimizes a regularized functional directly in a reproducing kernel Hilbert space defined by a kernel. In contrast, support vector machines (SVMs) implement the structural risk minimization principle and use the kernel trick to extend it to the nonlinear case. While both have a sound mathematical foundation, RLS is strikingly simple. On the other hand, SVMs in general have a sparse representation of the solution. In this paper, we introduce a very fast version of the RLS algorithm that maintains the achievable level of performance. The proposed algorithm computes solutions in O(m) time and O(1) space, where m is the number of training points. We demonstrate the efficacy of our very fast RLS algorithm on a number of data sets, both real and simulated.
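For context, the standard (non-accelerated) kernel RLS classifier that the paper starts from admits a closed-form solution: with kernel matrix K and regularization parameter λ, the expansion coefficients solve the linear system (K + λmI)c = y, and a new point is classified by the sign of Σᵢ cᵢ K(x, xᵢ). The sketch below illustrates this baseline only, not the paper's O(m)-time variant; the kernel choice (Gaussian) and parameter names are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of A and the rows of B.
    # (Illustrative choice; any positive-definite kernel defines an RKHS.)
    sq_dists = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq_dists)

def rls_fit(X, y, lam=1e-2, gamma=1.0):
    # Baseline kernel RLS: solve (K + lam * m * I) c = y for the
    # coefficients c of the kernel expansion f(x) = sum_i c_i K(x, x_i).
    # Direct solution costs O(m^3) time and O(m^2) space, which is the
    # bottleneck the paper's fast variant is designed to remove.
    m = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + lam * m * np.eye(m), y)

def rls_predict(X_train, c, X_test, gamma=1.0):
    # Classify by the sign of the learned function at each test point.
    return np.sign(rbf_kernel(X_test, X_train, gamma) @ c)
```

Unlike an SVM, every training point receives a (generally nonzero) coefficient cᵢ, which is the non-sparsity the abstract contrasts with SVMs.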

Cite

Text

Zhang and Peng. "Efficient Regularized Least Squares Classification." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2004. doi:10.1109/CVPR.2004.331

Markdown

[Zhang and Peng. "Efficient Regularized Least Squares Classification." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2004.](https://mlanthology.org/cvprw/2004/zhang2004cvprw-efficient/) doi:10.1109/CVPR.2004.331

BibTeX

@inproceedings{zhang2004cvprw-efficient,
  title     = {{Efficient Regularized Least Squares Classification}},
  author    = {Zhang, Peng and Peng, Jing},
  booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops},
  year      = {2004},
  pages     = {98},
  doi       = {10.1109/CVPR.2004.331},
  url       = {https://mlanthology.org/cvprw/2004/zhang2004cvprw-efficient/}
}