Open Problem: Can Local Regularization Learn All Multiclass Problems?

Abstract

Multiclass classification is the simple generalization of binary classification to arbitrary label sets. Despite its simplicity, it has been remarkably resistant to study: a characterization of multiclass learnability was established only two years ago by Brukhim et al. 2022, and the understanding of optimal learners for multiclass problems remains fairly limited. We ask whether there exists a simple algorithmic template — akin to empirical risk minimization (ERM) for binary classification — which characterizes multiclass learning. Namely, we ask whether local regularization, introduced by Asilis et al. 2024, is sufficiently expressive to learn all learnable multiclass problems. Towards (negatively) resolving the problem, we propose a hypothesis class which may not be learnable by any such local regularizer.
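For readers unfamiliar with the ERM template the abstract alludes to, the following is a minimal, illustrative sketch (not from the paper): an ERM learner simply returns any hypothesis in the class that minimizes training error on the sample.

```python
# Minimal sketch of empirical risk minimization (ERM) over a finite
# hypothesis class. All names here are illustrative, not the paper's.

def erm(hypotheses, sample):
    """Return a hypothesis minimizing empirical (0-1) risk.

    hypotheses: iterable of callables x -> label
    sample: list of (x, y) pairs
    """
    def empirical_risk(h):
        return sum(1 for x, y in sample if h(x) != y) / len(sample)
    return min(hypotheses, key=empirical_risk)

# Toy example: threshold classifiers on the integers.
H = [lambda x, t=t: int(x >= t) for t in range(5)]
S = [(0, 0), (1, 0), (2, 1), (3, 1)]
h_star = erm(H, S)  # a threshold consistent with the sample
```

For binary classification, any such ERM learner is (near-)optimal whenever the class is learnable; the open problem asks whether local regularization plays an analogous universal role in the multiclass setting.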

Cite

Text

Asilis et al. "Open Problem: Can Local Regularization Learn All Multiclass Problems?" Conference on Learning Theory, 2024.

Markdown

[Asilis et al. "Open Problem: Can Local Regularization Learn All Multiclass Problems?" Conference on Learning Theory, 2024.](https://mlanthology.org/colt/2024/asilis2024colt-open/)

BibTeX

@inproceedings{asilis2024colt-open,
  title     = {{Open Problem: Can Local Regularization Learn All Multiclass Problems?}},
  author    = {Asilis, Julian and Devic, Siddartha and Dughmi, Shaddin and Sharan, Vatsal and Teng, Shang-Hua},
  booktitle = {Conference on Learning Theory},
  year      = {2024},
  pages     = {5301--5305},
  volume    = {247},
  url       = {https://mlanthology.org/colt/2024/asilis2024colt-open/}
}