Learning Without Mixing: Towards a Sharp Analysis of Linear System Identification

Abstract

We prove that the ordinary least-squares (OLS) estimator attains nearly minimax optimal performance for the identification of linear dynamical systems from a single observed trajectory. Our upper bound relies on a generalization of Mendelson's small-ball method to dependent data, eschewing the use of standard mixing-time arguments. Our lower bounds reveal that these upper bounds match up to logarithmic factors. In particular, we capture the correct signal-to-noise behavior of the problem, showing that more unstable linear systems are easier to estimate. This behavior is qualitatively different from what mixing-time arguments suggest, namely that unstable systems are more difficult to estimate. We generalize our technique to provide bounds for a more general class of linear response time-series.
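
The setting studied in the abstract can be illustrated concretely: regress each observed state on its predecessor along a single trajectory and read off the OLS estimate of the transition matrix. The sketch below is not the authors' code; the dimension, horizon, spectral radius, and noise level are illustrative assumptions, and only NumPy is used.

```python
# Minimal sketch: OLS identification of a linear system x_{t+1} = A x_t + w_t
# from a single trajectory. All numerical choices here are assumptions made
# for illustration, not values taken from the paper.
import numpy as np

rng = np.random.default_rng(0)

d = 3          # state dimension (assumed)
T = 500        # trajectory length (assumed)
sigma = 1.0    # process-noise standard deviation (assumed)

# Ground-truth dynamics: a random matrix rescaled to spectral radius ~0.98,
# a marginally stable regime of the kind the paper's analysis covers.
A_true = rng.standard_normal((d, d))
A_true *= 0.98 / max(abs(np.linalg.eigvals(A_true)))

# Roll out a single trajectory x_0, ..., x_T driven by Gaussian noise.
X = np.zeros((T + 1, d))
for t in range(T):
    X[t + 1] = A_true @ X[t] + sigma * rng.standard_normal(d)

# OLS estimate: minimize sum_t ||x_{t+1} - A x_t||^2 over A.
# lstsq solves for B with x_t^T B ~ x_{t+1}^T, so the estimate is B^T.
B, *_ = np.linalg.lstsq(X[:-1], X[1:], rcond=None)
A_hat = B.T

print("operator-norm error:", np.linalg.norm(A_hat - A_true, 2))
```

Running the sketch with a longer trajectory (larger `T`) shrinks the operator-norm error, matching the qualitative message of the paper that OLS on a single trajectory suffices for consistent identification.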

Cite

Text

Simchowitz et al. "Learning Without Mixing: Towards a Sharp Analysis of Linear System Identification." Annual Conference on Computational Learning Theory, 2018.

Markdown

[Simchowitz et al. "Learning Without Mixing: Towards a Sharp Analysis of Linear System Identification." Annual Conference on Computational Learning Theory, 2018.](https://mlanthology.org/colt/2018/simchowitz2018colt-learning/)

BibTeX

@inproceedings{simchowitz2018colt-learning,
  title     = {{Learning Without Mixing: Towards a Sharp Analysis of Linear System Identification}},
  author    = {Simchowitz, Max and Mania, Horia and Tu, Stephen and Jordan, Michael I. and Recht, Benjamin},
  booktitle = {Annual Conference on Computational Learning Theory},
  year      = {2018},
  pages     = {439--473},
  url       = {https://mlanthology.org/colt/2018/simchowitz2018colt-learning/}
}