A Lower Bound for Learning Distributions Generated by Probabilistic Automata
Abstract
Known algorithms for learning PDFA can only be shown to run in time polynomial in the so-called distinguishability µ of the target machine, in addition to the number of states and the usual accuracy and confidence parameters. We show that the dependence on µ is necessary for every algorithm whose structure resembles existing ones. As a technical tool, we define a new variant of Statistical Queries termed L∞-queries. We show how these queries can be simulated from samples and observe that known PAC algorithms for learning PDFA can be rewritten to access their target using L∞-queries and standard Statistical Queries. Finally, we show a lower bound: every algorithm that learns PDFA using queries with a reasonable tolerance needs a number of queries larger than (1/µ)^c for every c < 1.
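For intuition, here is a minimal sketch (not taken from the paper) of how an L∞-query might be simulated from samples. It assumes the query asks for an approximation, within some tolerance, of the largest pointwise probability gap max_x |D1(x) − D2(x)| between two distributions over strings; the simulation draws i.i.d. samples from each distribution and returns the maximum difference of empirical frequencies. The function name and signature are hypothetical.

```python
from collections import Counter

def simulate_linf_query(sample_1, sample_2):
    """Hypothetical sketch: estimate the L-infinity distance
    max_x |D1(x) - D2(x)| between two string distributions from
    i.i.d. samples. With enough samples, the estimate falls within
    the query's tolerance with high probability."""
    counts_1, counts_2 = Counter(sample_1), Counter(sample_2)
    n_1, n_2 = len(sample_1), len(sample_2)
    support = set(counts_1) | set(counts_2)
    # Maximum gap between empirical point probabilities over the observed support.
    return max(abs(counts_1[x] / n_1 - counts_2[x] / n_2) for x in support)

# Example: simulate_linf_query(["ab", "a", "ab"], ["a", "b", "ab"]) returns 1/3,
# the largest empirical gap (attained at "ab" and at "b").
```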
Cite
Text
Balle et al. "A Lower Bound for Learning Distributions Generated by Probabilistic Automata." International Conference on Algorithmic Learning Theory, 2010. doi:10.1007/978-3-642-16108-7_17
Markdown
[Balle et al. "A Lower Bound for Learning Distributions Generated by Probabilistic Automata." International Conference on Algorithmic Learning Theory, 2010.](https://mlanthology.org/alt/2010/balle2010alt-lower/) doi:10.1007/978-3-642-16108-7_17
BibTeX
@inproceedings{balle2010alt-lower,
title = {{A Lower Bound for Learning Distributions Generated by Probabilistic Automata}},
author = {Balle, Borja and Castro, Jorge and Gavaldà, Ricard},
booktitle = {International Conference on Algorithmic Learning Theory},
year = {2010},
pages = {179-193},
doi = {10.1007/978-3-642-16108-7_17},
url = {https://mlanthology.org/alt/2010/balle2010alt-lower/}
}