Stochastic Finite Learning of the Pattern Languages
Abstract
The present paper proposes a new learning model, called *stochastic finite learning*, and shows the whole class of pattern languages to be learnable within this model. This main result is achieved by providing a new and improved average-case analysis of the Lange–Wiehagen (*New Generation Computing*, 8, 361–370) algorithm learning the class of all pattern languages in the limit from positive data. The complexity measure chosen is the *total learning time*, i.e., the overall time taken by the algorithm until convergence. The expectation of the total learning time is carefully analyzed, and exponentially shrinking tail bounds for it are established for a large class of probability distributions. For every pattern π containing k different variables, it is shown that Lange and Wiehagen's algorithm possesses an expected total learning time of \(O(\hat{\alpha}^k\, E[\Lambda]\log_{1/\beta}(k))\), where \(\hat{\alpha}\) and β are two easily computable parameters arising naturally from the underlying probability distributions, and E[Λ] is the expected example string length. Finally, assuming a bit of domain knowledge concerning the underlying class of probability distributions, it is shown how to convert learning in the limit into *stochastic finite learning*.
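The algorithm analyzed here learns in the limit from positive data: it maintains a pattern hypothesis and updates it with each new example string. The following is only a simplified illustrative sketch of that style of update, not the authors' exact procedure; the representation (constants as characters, variables as integers) and the rule for reusing variables are assumptions made for the sake of the example.

```python
def lw_step(h, w):
    """One hypothesis update of a Lange–Wiehagen-style learner (illustrative sketch).

    h: current hypothesis, a list of constant symbols (str) and
       variables (int), or None if no example has been seen yet.
    w: a new positive example string drawn from the target pattern language.
    """
    if h is None or len(w) < len(h):
        return list(w)           # a shorter example becomes the new hypothesis
    if len(w) > len(h):
        return h                 # longer examples leave the hypothesis unchanged
    # Equal length: keep agreeing constants and existing variables,
    # and turn each mismatching symbol pair into a variable
    # (one shared variable per distinct pair, an assumption of this sketch).
    fresh = max((s for s in h if isinstance(s, int)), default=-1) + 1
    var_of = {}
    out = []
    for a, b in zip(h, w):
        if isinstance(a, int) or a == b:
            out.append(a)
        else:
            if (a, b) not in var_of:
                var_of[(a, b)] = fresh
                fresh += 1
            out.append(var_of[(a, b)])
    return out


# Feeding two same-length examples generalizes the differing position
# into a variable:
h = None
for w in ["aba", "abb"]:
    h = lw_step(h, w)
# h is now ['a', 'b', 0], i.e., the pattern "ab" followed by a variable.
```

The total learning time studied in the paper is the overall time spent across all such update steps until the hypothesis stops changing.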
Cite
Text
Rossmanith and Zeugmann. "Stochastic Finite Learning of the Pattern Languages." Machine Learning, 2001. doi:10.1023/A:1010875913047

Markdown

[Rossmanith and Zeugmann. "Stochastic Finite Learning of the Pattern Languages." Machine Learning, 2001.](https://mlanthology.org/mlj/2001/rossmanith2001mlj-stochastic/) doi:10.1023/A:1010875913047

BibTeX
@article{rossmanith2001mlj-stochastic,
title = {{Stochastic Finite Learning of the Pattern Languages}},
author = {Rossmanith, Peter and Zeugmann, Thomas},
journal = {Machine Learning},
year = {2001},
  pages = {67--91},
doi = {10.1023/A:1010875913047},
volume = {44},
url = {https://mlanthology.org/mlj/2001/rossmanith2001mlj-stochastic/}
}