Learning Is a Kan Extension

Abstract

Previous work has demonstrated that efficient algorithms exist for computing Kan extensions and that some Kan extensions have interesting similarities to various machine learning algorithms. This paper closes this gap by proving that all error minimisation algorithms may be presented as Kan extensions. This result provides a foundation for future work investigating the optimisation of machine learning algorithms through their presentation as Kan extensions. A corollary of this representation of error-minimising algorithms is a presentation of error from the perspective of lossy and lossless transformations of data.
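For readers unfamiliar with the central notion, the following is the standard textbook universal property of a left Kan extension; it is general category-theoretic background, not the paper's specific learning-theoretic construction:

```latex
% Given functors K : A -> B and F : A -> C, the left Kan extension
% Lan_K F : B -> C is equipped with a unit eta : F => (Lan_K F) . K
\[
  \mathrm{Lan}_K F : \mathcal{B} \to \mathcal{C},
  \qquad
  \eta : F \Rightarrow (\mathrm{Lan}_K F) \circ K,
\]
% which is universal: every G : B -> C with a natural transformation
% alpha : F => G . K factors through eta uniquely.
\[
  \forall\, G : \mathcal{B} \to \mathcal{C},\;
  \forall\, \alpha : F \Rightarrow G \circ K,\;
  \exists!\, \sigma : \mathrm{Lan}_K F \Rightarrow G
  \;\text{ such that }\;
  \alpha = (\sigma \ast K) \circ \eta .
\]
```

Intuitively, $\mathrm{Lan}_K F$ is the "best approximation" to extending $F$ along $K$, which is the flavour of optimality the abstract connects to error minimisation.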

Cite

Text

Pugh et al. "Learning Is a Kan Extension." Transactions on Machine Learning Research, 2025.

Markdown

[Pugh et al. "Learning Is a Kan Extension." Transactions on Machine Learning Research, 2025.](https://mlanthology.org/tmlr/2025/pugh2025tmlr-learning/)

BibTeX

@article{pugh2025tmlr-learning,
  title     = {{Learning Is a Kan Extension}},
  author    = {Pugh, Matthew and Harris, Nick and Cirstea, Corina and Grundy, Jo},
  journal   = {Transactions on Machine Learning Research},
  year      = {2025},
  url       = {https://mlanthology.org/tmlr/2025/pugh2025tmlr-learning/}
}