Computable Learning of Natural Hypothesis Classes
Abstract
This paper concerns the recent notion of computably probably approximately correct (PAC) learning, which lies between statistical learning theory, where no computational requirement is placed on the learner, and efficient PAC learning, where the learner must be polynomially bounded. Examples have recently been given of hypothesis classes that are PAC-learnable but not computably PAC-learnable; however, these classes can be viewed as unnatural or non-canonical. We use the on-a-cone machinery from computability theory to prove that, under certain assumptions, any “natural” hypothesis class which is learnable must be computably learnable.
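To make the distinction concrete, here is a minimal illustrative sketch (not taken from the paper): a computable PAC learner for the standard class of threshold functions h_t(x) = 1 iff x ≥ t on [0, 1]. The learner is an explicit empirical-risk-minimization algorithm, so it is computable in the sense discussed above; the function names and the true threshold below are purely hypothetical choices for the example.

```python
import random

def erm_threshold(sample):
    """Empirical risk minimization for threshold functions on [0, 1].

    sample: list of (x, label) pairs with label in {0, 1}.
    Returns a threshold t with zero empirical error whenever the sample
    is consistent with some threshold (the realizable case).
    """
    # The smallest positively labeled point bounds the true threshold
    # from above; choosing it keeps all positive examples correct and,
    # on realizable samples, all negative examples as well.
    positives = [x for x, y in sample if y == 1]
    return min(positives) if positives else 1.0

# Usage: draw i.i.d. points labeled by a (hypothetical) true threshold 0.3.
random.seed(0)
t_star = 0.3
data = [(x, int(x >= t_star)) for x in (random.random() for _ in range(200))]
t_hat = erm_threshold(data)
```

Since the learner is a total, explicitly given algorithm on finite samples, it witnesses computable PAC learnability for this class; the paper's question is when learnability alone already guarantees the existence of such an algorithm.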
Cite
Akbari and Harrison-Trainor. "Computable Learning of Natural Hypothesis Classes." Proceedings of Thirty Eighth Conference on Learning Theory, 2025.
BibTeX
@inproceedings{akbari2025colt-computable,
title = {{Computable Learning of Natural Hypothesis Classes}},
author = {Akbari, Syed and Harrison-Trainor, Matthew},
booktitle = {Proceedings of Thirty Eighth Conference on Learning Theory},
year = {2025},
pages = {2--21},
volume = {291},
url = {https://mlanthology.org/colt/2025/akbari2025colt-computable/}
}