Generalised Entropy and Asymptotic Complexities of Languages

Abstract

This paper introduces the concept of asymptotic complexity of languages. This concept formalises the notion of learnability in a particular environment and generalises Lutz and Fortnow’s concepts of predictability and dimension. Asymptotic complexities in different prediction environments are then compared by describing the set of all pairs of asymptotic complexities with respect to different environments. A geometric characterisation in terms of generalised entropies is obtained, thereby generalising the results of Lutz and Fortnow.
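As background (not taken from the paper's text itself): in this line of work, the generalised entropy of a loss function is the minimal expected loss achievable by a single prediction against a fixed outcome distribution; for the logarithmic loss it coincides with Shannon entropy. A minimal numerical sketch of that identity, assuming the standard binary log-loss game, with the infimum approximated on a finite grid:

```python
import math

def generalised_entropy(p, loss, grid_size=10_000):
    """Generalised entropy of the distribution (1-p, p) w.r.t. a loss
    function: the minimal expected loss over predictions gamma in (0, 1),
    approximated here by searching a finite grid."""
    best = float("inf")
    for i in range(1, grid_size):
        gamma = i / grid_size
        expected = p * loss(1, gamma) + (1 - p) * loss(0, gamma)
        best = min(best, expected)
    return best

def log_loss(outcome, gamma):
    """Logarithmic loss: -ln(gamma) on outcome 1, -ln(1-gamma) on outcome 0."""
    return -math.log(gamma) if outcome == 1 else -math.log(1 - gamma)

def shannon_entropy(p):
    """Shannon entropy of (1-p, p) in nats."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log(p) - (1 - p) * math.log(1 - p)

# For log loss the generalised entropy matches Shannon entropy
# (the minimising prediction is gamma = p).
p = 0.3
print(generalised_entropy(p, log_loss))
print(shannon_entropy(p))
```

The grid search stands in for the exact infimum; for log loss the minimiser is gamma = p, so the two printed values agree to grid precision.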

Cite

Text

Kalnishkan et al. "Generalised Entropy and Asymptotic Complexities of Languages." Annual Conference on Computational Learning Theory, 2007. doi:10.1007/978-3-540-72927-3_22

Markdown

[Kalnishkan et al. "Generalised Entropy and Asymptotic Complexities of Languages." Annual Conference on Computational Learning Theory, 2007.](https://mlanthology.org/colt/2007/kalnishkan2007colt-generalised/) doi:10.1007/978-3-540-72927-3_22

BibTeX

@inproceedings{kalnishkan2007colt-generalised,
  title     = {{Generalised Entropy and Asymptotic Complexities of Languages}},
  author    = {Kalnishkan, Yuri and Vovk, Vladimir and Vyugin, Michael V.},
  booktitle = {Annual Conference on Computational Learning Theory},
  year      = {2007},
  pages     = {293--307},
  doi       = {10.1007/978-3-540-72927-3_22},
  url       = {https://mlanthology.org/colt/2007/kalnishkan2007colt-generalised/}
}