Iterative Linearized Control: Stable Algorithms and Complexity Guarantees
Abstract
We examine popular gradient-based algorithms for nonlinear control in light of the modern complexity analysis of first-order optimization algorithms. The examination reveals that the complexity bounds can be stated cleanly in terms of calls to a computational oracle related to dynamic programming, implementable by gradient back-propagation using machine learning software libraries such as PyTorch or TensorFlow. Finally, we propose a regularized Gauss-Newton algorithm that enjoys worst-case complexity bounds and improved convergence behavior in practice. A software library based on PyTorch is publicly available.
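The core of a regularized Gauss-Newton iteration can be illustrated on a generic nonlinear least-squares problem: each step linearizes the residuals and solves a damped normal system. The sketch below is a minimal NumPy illustration of that idea, not the paper's control-specific algorithm; the residual function, its Jacobian, and the damping parameter `lam` are toy choices made up for this example.

```python
import numpy as np

# Toy nonlinear residual r(x); the objective is f(x) = ||r(x)||^2.
# Both functions are illustrative, not from the paper.
def residual(x):
    return np.array([x[0] ** 2 - 1.0, x[0] * x[1] - 1.0])

def jacobian(x):
    return np.array([[2.0 * x[0], 0.0],
                     [x[1], x[0]]])

def regularized_gauss_newton(x, lam=1e-3, iters=20):
    # Each step solves the damped normal equations
    # (J^T J + lam * I) d = -J^T r, then updates x <- x + d.
    for _ in range(iters):
        r, J = residual(x), jacobian(x)
        d = np.linalg.solve(J.T @ J + lam * np.eye(len(x)), -J.T @ r)
        x = x + d
    return x

x_star = regularized_gauss_newton(np.array([2.0, 2.0]))
```

In the paper's setting, the analogous linearized subproblem is solved along a trajectory via dynamic programming, and the Jacobian-vector products are obtained by gradient back-propagation rather than an explicit Jacobian as above.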
Cite
Text
Roulet et al. "Iterative Linearized Control: Stable Algorithms and Complexity Guarantees." International Conference on Machine Learning, 2019.
Markdown
[Roulet et al. "Iterative Linearized Control: Stable Algorithms and Complexity Guarantees." International Conference on Machine Learning, 2019.](https://mlanthology.org/icml/2019/roulet2019icml-iterative/)
BibTeX
@inproceedings{roulet2019icml-iterative,
title = {{Iterative Linearized Control: Stable Algorithms and Complexity Guarantees}},
author = {Roulet, Vincent and Srinivasa, Siddhartha and Drusvyatskiy, Dmitriy and Harchaoui, Zaid},
booktitle = {International Conference on Machine Learning},
year = {2019},
pages = {5518--5527},
volume = {97},
url = {https://mlanthology.org/icml/2019/roulet2019icml-iterative/}
}