Learning Linear Dynamical Systems with Semi-Parametric Least Squares
Abstract
We analyze a simple prefiltered variation of the least squares estimator for the problem of estimation with biased, \emph{semi-parametric} noise, an error model studied more broadly in causal statistics and active learning. We prove an oracle inequality which demonstrates that this procedure provably mitigates the variance introduced by long-term dependencies. We then demonstrate that prefiltered least squares yields, to our knowledge, the first algorithm that provably estimates the parameters of partially-observed linear systems at rates which do not incur a worst-case dependence on the rate at which these dependencies decay. The algorithm is provably consistent even for systems which satisfy only the weaker \emph{marginal stability} condition obeyed by many classical models based on Newtonian mechanics. In this context, our semi-parametric framework yields guarantees for both stochastic and worst-case noise.
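To make the underlying regression concrete, the sketch below simulates a partially observed linear system and fits its first p Markov parameters by ordinary least squares on recent inputs; the system matrices, horizon p, and noise levels are illustrative assumptions rather than values from the paper, and the prefiltering refinement analyzed there is only indicated in a comment, not implemented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative slowly-mixing system; these matrices, the horizon p, and the
# noise levels are assumptions for demonstration, not taken from the paper.
A = np.array([[0.99, 0.10],
              [0.00, 0.99]])
B = np.array([1.0, 0.0])
C = np.array([1.0, 0.0])

T, p = 5000, 10                  # trajectory length, number of Markov parameters
u = rng.normal(size=T)           # white-noise excitation input
y = np.zeros(T)
x = np.zeros(2)
for t in range(T):
    y[t] = C @ x + 0.1 * rng.normal()                  # measurement noise
    x = A @ x + B * u[t] + 0.05 * rng.normal(size=2)   # process noise

# Ordinary least squares of y_t on the p most recent inputs estimates the
# Markov parameters C A^{k-1} B, k = 1..p; the paper's prefiltering step
# additionally removes long-range dependence carried by older history.
X = np.stack([u[t - p:t][::-1] for t in range(p, T)])  # rows: (u_{t-1}, ..., u_{t-p})
Y = y[p:]
theta_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)

true_markov = np.array([C @ np.linalg.matrix_power(A, k - 1) @ B
                        for k in range(1, p + 1)])
print(np.round(theta_hat, 3))
print(np.round(true_markov, 3))
```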
Cite
Text
Simchowitz et al. "Learning Linear Dynamical Systems with Semi-Parametric Least Squares." Conference on Learning Theory, 2019.
Markdown
[Simchowitz et al. "Learning Linear Dynamical Systems with Semi-Parametric Least Squares." Conference on Learning Theory, 2019.](https://mlanthology.org/colt/2019/simchowitz2019colt-learning/)
BibTeX
@inproceedings{simchowitz2019colt-learning,
  title     = {{Learning Linear Dynamical Systems with Semi-Parametric Least Squares}},
  author    = {Simchowitz, Max and Boczar, Ross and Recht, Benjamin},
  booktitle = {Conference on Learning Theory},
  year      = {2019},
  pages     = {2714--2802},
  volume    = {99},
  url       = {https://mlanthology.org/colt/2019/simchowitz2019colt-learning/}
}