Prediction and Dimension
Abstract
Given a set X of sequences over a finite alphabet, we investigate the following three quantities. (i) The feasible predictability of X is the highest success ratio that a polynomial-time randomized predictor can achieve on all sequences in X. (ii) The deterministic feasible predictability of X is the highest success ratio that a polynomial-time deterministic predictor can achieve on all sequences in X. (iii) The feasible dimension of X is the polynomial-time effectivization of the classical Hausdorff dimension ("fractal dimension") of X. Predictability is known to be stable in the sense that the feasible predictability of X ∪ Y is always the minimum of the feasible predictabilities of X and Y. We show that deterministic predictability also has this property if X and Y are computably presentable. We show that deterministic predictability coincides with predictability on singleton sets. Our main theorem states that the feasible dimension of X is bounded above by the maximum entropy of the predictability of X and bounded below by the segmented self-information of the predictability of X, and that these bounds are tight.
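In symbols, the stability property and the main theorem's bounds can be sketched as follows (the notation here is assumed for illustration: pred_p for feasible predictability, dim_p for feasible dimension, H for the maximum entropy function, and I for the segmented self-information; the paper's own notation may differ):

```latex
% Stability: the feasible predictability of a union is the minimum
% of the feasible predictabilities of the parts.
\mathrm{pred}_p(X \cup Y) \;=\; \min\bigl\{\mathrm{pred}_p(X),\ \mathrm{pred}_p(Y)\bigr\}

% Main theorem: the feasible dimension is sandwiched between the
% segmented self-information and the maximum entropy of the
% predictability, and both bounds are tight.
\mathcal{I}\bigl(\mathrm{pred}_p(X)\bigr)
  \;\le\; \dim_p(X) \;\le\;
  \mathcal{H}\bigl(\mathrm{pred}_p(X)\bigr)
```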
Cite
Text:
Fortnow and Lutz. "Prediction and Dimension." Annual Conference on Computational Learning Theory, 2002. doi:10.1007/3-540-45435-7_26
Markdown:
[Fortnow and Lutz. "Prediction and Dimension." Annual Conference on Computational Learning Theory, 2002.](https://mlanthology.org/colt/2002/fortnow2002colt-prediction/) doi:10.1007/3-540-45435-7_26
BibTeX:
@inproceedings{fortnow2002colt-prediction,
title = {{Prediction and Dimension}},
author = {Fortnow, Lance and Lutz, Jack H.},
booktitle = {Annual Conference on Computational Learning Theory},
year = {2002},
pages = {380--395},
doi = {10.1007/3-540-45435-7_26},
url = {https://mlanthology.org/colt/2002/fortnow2002colt-prediction/}
}