Online Infinite-Dimensional Regression: Learning Linear Operators

Abstract

We consider the problem of learning linear operators under squared loss between two infinite-dimensional Hilbert spaces in the online setting. We show that the class of linear operators with uniformly bounded $p$-Schatten norm is online learnable for any $p \in [1, \infty)$. On the other hand, we prove an impossibility result by showing that the class of linear operators that are uniformly bounded with respect to the operator norm is \textit{not} online learnable. Moreover, we show a separation between sequential uniform convergence and online learnability by identifying a class of bounded linear operators that is online learnable but for which sequential uniform convergence fails. Finally, we prove that the impossibility result and the separation between uniform convergence and learnability also hold in the batch setting.
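For readers unfamiliar with the norms referenced above, the following is a brief reminder (not part of the original abstract): for a compact linear operator $T$ between Hilbert spaces with singular values $\sigma_1(T) \geq \sigma_2(T) \geq \dots$, the $p$-Schatten norm and the operator norm are

$$
\|T\|_{S_p} = \Big( \sum_{j \geq 1} \sigma_j(T)^p \Big)^{1/p} \quad \text{for } p \in [1, \infty), \qquad \|T\|_{\mathrm{op}} = \sigma_1(T),
$$

so the operator-norm ball is strictly larger than any $p$-Schatten ball, which is consistent with the learnability gap described in the abstract.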

Cite

Text

Subedi et al. "Online Infinite-Dimensional Regression: Learning Linear Operators." Proceedings of The 35th International Conference on Algorithmic Learning Theory, 2024.

Markdown

[Subedi et al. "Online Infinite-Dimensional Regression: Learning Linear Operators." Proceedings of The 35th International Conference on Algorithmic Learning Theory, 2024.](https://mlanthology.org/alt/2024/subedi2024alt-online/)

BibTeX

@inproceedings{subedi2024alt-online,
  title     = {{Online Infinite-Dimensional Regression: Learning Linear Operators}},
  author    = {Subedi, Unique and Raman, Vinod and Tewari, Ambuj},
  booktitle = {Proceedings of The 35th International Conference on Algorithmic Learning Theory},
  year      = {2024},
  pages     = {1113--1133},
  volume    = {237},
  url       = {https://mlanthology.org/alt/2024/subedi2024alt-online/}
}