Learning and Generalization with Minimerror, a Temperature-Dependent Learning Algorithm
Abstract
We study the numerical performance of Minimerror, a recently introduced learning algorithm for the perceptron that has been analytically shown to be optimal for learning both linearly and nonlinearly separable functions. We present its implementation for learning linearly separable Boolean functions. Numerical results are in excellent agreement with the theoretical predictions.
Cite
Text
Raffin and Gordon. "Learning and Generalization with Minimerror, a Temperature-Dependent Learning Algorithm." Neural Computation, 1995. doi:10.1162/NECO.1995.7.6.1206

Markdown

[Raffin and Gordon. "Learning and Generalization with Minimerror, a Temperature-Dependent Learning Algorithm." Neural Computation, 1995.](https://mlanthology.org/neco/1995/raffin1995neco-learning/) doi:10.1162/NECO.1995.7.6.1206

BibTeX
@article{raffin1995neco-learning,
title = {{Learning and Generalization with Minimerror, a Temperature-Dependent Learning Algorithm}},
author = {Raffin, Bruno and Gordon, Mirta B.},
journal = {Neural Computation},
year = {1995},
pages = {1206-1224},
doi = {10.1162/NECO.1995.7.6.1206},
volume = {7},
url = {https://mlanthology.org/neco/1995/raffin1995neco-learning/}
}