The Power of Selective Memory: Self-Bounded Learning of Prediction Suffix Trees

Abstract

Prediction suffix trees (PSTs) provide a popular and effective tool for tasks such as compression, classification, and language modeling. In this paper we take a decision-theoretic view of PSTs for the task of sequence prediction. Generalizing the notion of margin to PSTs, we present an online PST learning algorithm and derive a loss bound for it. The depth of the PST generated by this algorithm scales linearly with the length of the input. We then describe a self-bounded enhancement of our learning algorithm which automatically grows a bounded-depth PST. We also prove an analogous mistake bound for the self-bounded algorithm. The result is an efficient algorithm that neither relies on a priori assumptions about the shape or maximal depth of the target PST nor requires any parameters. To our knowledge, this is the first provably correct PST learning algorithm that generates a bounded-depth PST while being competitive with any fixed PST determined in hindsight.
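
For intuition, here is a minimal Python sketch of the kind of object the abstract describes: a prediction suffix tree scored along the path matching the recent context, with a perceptron-style margin update that grows the tree by at most one node per mistake. This is a sketch under stated assumptions, not the paper's exact algorithm: the class and function names, the learning rate `eta`, the margin of 1, and the depth-damped weighting `2**(-d/2)` are illustrative choices.

```python
# Illustrative sketch of online sequence prediction with a prediction
# suffix tree (PST). NOT the paper's exact algorithm: the perceptron-style
# update, eta, and the 2**(-depth/2) damping are assumptions.

class PSTNode:
    def __init__(self):
        self.weight = 0.0   # this context's additive contribution to the score
        self.children = {}  # symbol -> node for the one-symbol-longer suffix

def predict(root, context):
    """Score a context by summing node weights along the path matching the
    reversed context, damping deeper (longer) suffixes."""
    node, score, depth = root, root.weight, 0
    for symbol in reversed(context):  # walk from the most recent symbol back
        if symbol not in node.children:
            break
        node = node.children[symbol]
        depth += 1
        score += node.weight * 2 ** (-depth / 2)
    return score

def update(root, context, y, score, margin=1.0, eta=0.5):
    """On a margin error, push weights along the matched path toward the
    label y in {-1, +1}, growing the tree by at most one node per mistake
    so the context depth can only increase slowly."""
    if y * score >= margin:
        return  # margin achieved: no update and no growth
    node, depth = root, 0
    node.weight += eta * y
    for symbol in reversed(context):
        new = symbol not in node.children
        if new:
            node.children[symbol] = PSTNode()  # grow: remember a longer suffix
        node = node.children[symbol]
        depth += 1
        node.weight += eta * y * 2 ** (-depth / 2)
        if new:
            break  # extend the tree by at most one node per mistake

# Toy usage: predict each symbol of a +/-1 stream from its growing prefix.
stream = [1, 1, -1, 1, 1, -1, 1, 1, -1]
root = PSTNode()
for t in range(1, len(stream)):
    context, y = stream[:t], stream[t]
    s = predict(root, context)
    update(root, context, y, s)
```

The depth damping is what keeps long contexts from dominating the score; the self-bounded algorithm in the paper goes further and guarantees the tree it grows has bounded depth.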

Cite

Text

Dekel et al. "The Power of Selective Memory: Self-Bounded Learning of Prediction Suffix Trees." Neural Information Processing Systems, 2004.

Markdown

[Dekel et al. "The Power of Selective Memory: Self-Bounded Learning of Prediction Suffix Trees." Neural Information Processing Systems, 2004.](https://mlanthology.org/neurips/2004/dekel2004neurips-power/)

BibTeX

@inproceedings{dekel2004neurips-power,
  title     = {{The Power of Selective Memory: Self-Bounded Learning of Prediction Suffix Trees}},
  author    = {Dekel, Ofer and Shalev-Shwartz, Shai and Singer, Yoram},
  booktitle = {Neural Information Processing Systems},
  year      = {2004},
  pages     = {345--352},
  url       = {https://mlanthology.org/neurips/2004/dekel2004neurips-power/}
}