On the Computational Power of Online Gradient Descent

Abstract

We prove that the evolution of weight vectors in online gradient descent can encode arbitrary polynomial-space computations, even in very simple learning settings. Our results imply that, under weak complexity-theoretic assumptions, it is impossible to reason efficiently about the fine-grained behavior of online gradient descent.
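The "evolution of weight vectors" refers to the trajectory w_1, w_2, ... produced by the standard online gradient descent update on a stream of examples. Below is a minimal sketch of that generic update rule for intuition; the squared loss and example stream here are hypothetical placeholders and are not the paper's PSPACE-encoding construction.

```python
import numpy as np

def online_gradient_descent(examples, eta=0.1, dim=2):
    """Run OGD on a stream of (x, y) pairs with squared loss (illustrative only)."""
    w = np.zeros(dim)          # initial weight vector w_1
    trajectory = [w.copy()]    # record the evolution of the weights
    for x, y in examples:
        grad = (w @ x - y) * x     # gradient of 0.5*(w.x - y)^2 at w
        w = w - eta * grad         # update: w_{t+1} = w_t - eta * grad
        trajectory.append(w.copy())
    return trajectory

# Example: a short (hypothetical) stream of examples drives the weight trajectory.
stream = [(np.array([1.0, 0.0]), 1.0), (np.array([0.0, 1.0]), -1.0)]
print(online_gradient_descent(stream))
```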

Cite

Text

Chatziafratis et al. "On the Computational Power of Online Gradient Descent." Conference on Learning Theory, 2019.

Markdown

[Chatziafratis et al. "On the Computational Power of Online Gradient Descent." Conference on Learning Theory, 2019.](https://mlanthology.org/colt/2019/chatziafratis2019colt-computational/)

BibTeX

@inproceedings{chatziafratis2019colt-computational,
  title     = {{On the Computational Power of Online Gradient Descent}},
  author    = {Chatziafratis, Vaggos and Roughgarden, Tim and Wang, Joshua R.},
  booktitle = {Conference on Learning Theory},
  year      = {2019},
  pages     = {624--662},
  volume    = {99},
  url       = {https://mlanthology.org/colt/2019/chatziafratis2019colt-computational/}
}