Convergence and Error Bounds for Universal Prediction of Nonbinary Sequences
Abstract
Solomonoff's uncomputable universal prediction scheme ξ allows one to predict the next symbol xk of a sequence x1...xk-1 for any Turing computable, but otherwise unknown, probabilistic environment µ. This scheme is generalized to arbitrary environmental classes, which, among other benefits, allows the construction of computable universal prediction schemes ξ. Convergence of ξ to µ in a conditional mean squared sense and with µ probability 1 is proven. It is shown that the average number of prediction errors made by the universal ξ scheme rapidly converges to the number made by the best possible informed µ scheme. The schemes, theorems and proofs are given for a general finite alphabet, which introduces additional complications compared to the binary case. Several extensions of the presented theory and results are outlined, including general loss functions and bounds, games of chance, infinite alphabets, partial and delayed prediction, classification, and more active systems.
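The abstract describes predicting with a Bayes mixture ξ over a class of candidate environments ν, each weighted by a prior w_ν that is updated on every observed symbol. The following minimal sketch illustrates that idea for a tiny, hypothetical class of i.i.d. environments over a 3-symbol alphabet; the class, weights, and distributions are illustrative choices, not taken from the paper, and the true Solomonoff ξ mixes over all computable environments and is uncomputable.

```python
import random

ALPHABET = [0, 1, 2]

# Hypothetical candidate environments nu: each an i.i.d. distribution
# over the alphabet. The first plays the role of the true mu.
envs = [
    [0.7, 0.2, 0.1],
    [0.1, 0.8, 0.1],
    [1 / 3, 1 / 3, 1 / 3],
]
weights = [1 / 3, 1 / 3, 1 / 3]  # uniform prior w_nu

def xi_predict(weights):
    """Mixture prediction xi(x_k | x_<k) = sum_nu w_nu(x_<k) * nu(x_k)."""
    return [sum(w * env[a] for w, env in zip(weights, envs)) for a in ALPHABET]

def bayes_update(weights, symbol):
    """Posterior update: w_nu <- w_nu * nu(symbol), renormalized."""
    new = [w * env[symbol] for w, env in zip(weights, envs)]
    z = sum(new)
    return [w / z for w in new]

random.seed(0)
mu = envs[0]
for k in range(200):
    x = random.choices(ALPHABET, weights=mu)[0]  # sample x_k from the true mu
    weights = bayes_update(weights, x)

pred = xi_predict(weights)
# After many observations the posterior concentrates on mu, so xi's
# prediction approaches mu itself -- the convergence the paper quantifies.
```

This toy class makes the paper's convergence statement visible: the posterior weight on the true environment grows and ξ's conditional predictions approach µ's, at a rate the paper bounds in terms of the prior weight of µ.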
Cite
Text
Hutter, Marcus. "Convergence and Error Bounds for Universal Prediction of Nonbinary Sequences." European Conference on Machine Learning, 2001, pp. 239-250. doi:10.1007/3-540-44795-4_21
BibTeX
@inproceedings{hutter2001ecml-convergence,
title = {{Convergence and Error Bounds for Universal Prediction of Nonbinary Sequences}},
author = {Hutter, Marcus},
booktitle = {European Conference on Machine Learning},
year = {2001},
pages = {239-250},
doi = {10.1007/3-540-44795-4_21},
url = {https://mlanthology.org/ecmlpkdd/2001/hutter2001ecml-convergence/}
}