Open Problem: Online Local Learning

Abstract

In many learning problems, we attempt to infer *global* structure in the interest of making *local* predictions. For example, we might try to infer the skills of the competitors in a tournament in order to predict who will win a match, or we might try to infer characteristics of users and films in order to predict which users will like which films. In even relatively simple settings of this type, it is typically NP-hard to find the latent data that best explain some observations. But do these complexity-theoretic obstructions actually prevent us from making good predictions? Because each prediction depends on only a small number of variables, it might be possible to make good predictions without actually finding a good global assignment. This may seem to be a purely technical distinction, but recent work has shown that several local prediction problems actually *are* easy even though the corresponding global inference problems are hard. The question we pose is: how general is this phenomenon?
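To make the tournament example concrete, here is a minimal sketch (not the paper's algorithm) of what a "local" online predictor can look like: an Elo-style online logistic regression that predicts each match from only the two players involved and updates their ratings on the observed outcome, never searching for a globally optimal skill assignment. The class name `OnlineMatchPredictor` and the learning rate are illustrative assumptions.

```python
# Illustrative sketch, not the paper's method: each prediction is "local"
# (it touches only two players), and no globally optimal assignment of
# latent skills is ever computed.
import math
from collections import defaultdict

class OnlineMatchPredictor:
    def __init__(self, learning_rate=0.1):
        self.lr = learning_rate
        self.rating = defaultdict(float)  # latent skill estimate per player

    def predict(self, a, b):
        """Probability that player a beats player b (logistic of rating gap)."""
        return 1.0 / (1.0 + math.exp(self.rating[b] - self.rating[a]))

    def update(self, a, b, a_won):
        """Gradient step on the log loss of the observed outcome."""
        p = self.predict(a, b)
        grad = (1.0 if a_won else 0.0) - p
        self.rating[a] += self.lr * grad
        self.rating[b] -= self.lr * grad

# Usage: predict before each match, then update on the revealed outcome.
model = OnlineMatchPredictor()
matches = [("alice", "bob", True), ("bob", "carol", False), ("alice", "carol", True)]
for a, b, a_won in matches:
    p = model.predict(a, b)
    model.update(a, b, a_won)
    print(f"P({a} beats {b}) = {p:.2f}, outcome: {'win' if a_won else 'loss'}")
```

Whether simple local strategies of this kind can achieve low regret in general, without solving the hard global inference problem, is exactly the question the open problem poses.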

Cite

Text

Christiano. "Open Problem: Online Local Learning." Annual Conference on Computational Learning Theory, 2014.

Markdown

[Christiano. "Open Problem: Online Local Learning." Annual Conference on Computational Learning Theory, 2014.](https://mlanthology.org/colt/2014/christiano2014colt-open/)

BibTeX

@inproceedings{christiano2014colt-open,
  title     = {{Open Problem: Online Local Learning}},
  author    = {Christiano, Paul F.},
  booktitle = {Annual Conference on Computational Learning Theory},
  year      = {2014},
  pages     = {1290--1294},
  url       = {https://mlanthology.org/colt/2014/christiano2014colt-open/}
}