A Second-Order Perceptron Algorithm
Abstract
We introduce a variant of the Perceptron algorithm called the second-order Perceptron algorithm, which is able to exploit certain spectral properties of the data. We analyze the second-order Perceptron algorithm in the mistake bound model of on-line learning and prove bounds in terms of the eigenvalues of the Gram matrix created from the data. The performance of the second-order Perceptron algorithm is affected by the setting of a parameter controlling the sensitivity to the distribution of the eigenvalues of the Gram matrix. Since this information is not preliminarily available to on-line algorithms, we also design a refined version of the second-order Perceptron algorithm which adaptively sets the value of this parameter. For this second algorithm we are able to prove mistake bounds corresponding to a nearly optimal constant setting of the parameter.
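To make the idea concrete, here is a minimal NumPy sketch of a second-order (whitened) Perceptron pass in the style the abstract describes: the learner keeps a correlation matrix of the instances on which it erred and predicts with the inverse-correlation-transformed weight vector. The parameter `a` plays the role of the sensitivity parameter mentioned in the abstract; the function name, data, and interface are illustrative assumptions, not the paper's notation.

```python
import numpy as np

def second_order_perceptron(X, y, a=1.0):
    """Sketch of one on-line pass of a second-order Perceptron.

    X : (T, n) array of instances, y : (T,) labels in {-1, +1},
    a : regularization parameter controlling spectral sensitivity (assumed name).
    Returns the number of mistakes made during the pass.
    """
    n = X.shape[1]
    M = a * np.eye(n)       # a*I plus outer products of mistaken instances
    v = np.zeros(n)         # sum of y_s * x_s over mistaken trials
    mistakes = 0
    for x_t, y_t in zip(X, y):
        # Tentatively include the current instance before predicting
        A = M + np.outer(x_t, x_t)
        margin = v @ np.linalg.solve(A, x_t)
        y_hat = 1.0 if margin >= 0 else -1.0   # break ties as +1
        if y_hat != y_t:
            mistakes += 1
            M = A               # keep the mistaken instance in the matrix
            v = v + y_t * x_t
    return mistakes
```

With `a` very large, `A` approaches a multiple of the identity and the update degenerates toward the classical first-order Perceptron, which is why the paper's refined version tunes this parameter adaptively.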
Cite
Text
Cesa-Bianchi et al. "A Second-Order Perceptron Algorithm." Annual Conference on Computational Learning Theory, 2002. doi:10.1007/3-540-45435-7_9
Markdown
[Cesa-Bianchi et al. "A Second-Order Perceptron Algorithm." Annual Conference on Computational Learning Theory, 2002.](https://mlanthology.org/colt/2002/cesabianchi2002colt-second/) doi:10.1007/3-540-45435-7_9
BibTeX
@inproceedings{cesabianchi2002colt-second,
title = {{A Second-Order Perceptron Algorithm}},
author = {Cesa-Bianchi, Nicolò and Conconi, Alex and Gentile, Claudio},
booktitle = {Annual Conference on Computational Learning Theory},
year = {2002},
pages = {121--137},
doi = {10.1007/3-540-45435-7_9},
url = {https://mlanthology.org/colt/2002/cesabianchi2002colt-second/}
}