Open Problem: Direct Sums in Learning Theory

Abstract

In computer science, the term 'direct sum' refers to fundamental questions about how computational or information complexity scales with the number of task instances. Consider an algorithmic task \(T\) and a computational resource \(C\). For instance, \(T\) might be the task of computing a polynomial, with \(C\) representing the number of arithmetic operations required, or \(T\) could be a learning task with its sample complexity as \(C\). The direct sum question concerns the cost of solving \(k\) separate instances of \(T\), and in particular how this aggregate cost compares to the resources needed for a single instance. Typically, the cost for multiple instances is at most \(k\) times the cost of one, since each instance can be handled independently. However, there are intriguing scenarios where the total cost for \(k\) instances is less than this linear bound. Such questions naturally extend to the machine-learning setting, in which one may be interested in solving several learning problems at once. This notion of direct sums of learning problems gives rise to various natural questions and interesting problems.
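The scaling comparison described above can be written as a short worked inequality. The notation \(\mathrm{cost}_C\) and \(T^{\oplus k}\) below is illustrative shorthand for this sketch, not notation taken from the paper:

```latex
% Sketch of the direct sum question (notation illustrative).
% Let cost_C(T) be the amount of resource C needed to solve task T,
% and let T^{\oplus k} stand for k independent instances of T.
% Solving each instance separately yields the trivial upper bound;
% the direct sum question asks whether this bound can be loose.
\[
  \mathrm{cost}_C\bigl(T^{\oplus k}\bigr) \;\le\; k \cdot \mathrm{cost}_C(T),
\]
\[
  \text{Question: can } \mathrm{cost}_C\bigl(T^{\oplus k}\bigr)
  \;=\; o\bigl(k \cdot \mathrm{cost}_C(T)\bigr)\,?
\]
```

If the second relation holds for some task and resource, solving the \(k\) instances jointly is genuinely cheaper than running \(k\) independent solutions.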

Cite

Text

Hanneke et al. "Open Problem: Direct Sums in Learning Theory." Conference on Learning Theory, 2024.

Markdown

[Hanneke et al. "Open Problem: Direct Sums in Learning Theory." Conference on Learning Theory, 2024.](https://mlanthology.org/colt/2024/hanneke2024colt-open/)

BibTeX

@inproceedings{hanneke2024colt-open,
  title     = {{Open Problem: Direct Sums in Learning Theory}},
  author    = {Hanneke, Steve and Moran, Shay and Waknine, Tom},
  booktitle = {Conference on Learning Theory},
  year      = {2024},
  pages     = {5325--5329},
  volume    = {247},
  url       = {https://mlanthology.org/colt/2024/hanneke2024colt-open/}
}