HOUDINI: Escaping from Moderately Constrained Saddles
Abstract
We give polynomial-time algorithms for escaping from high-dimensional saddle points under a moderate number of constraints. Given gradient access to a smooth function, we show that (noisy) gradient descent methods can escape from saddle points under a logarithmic number of inequality constraints. While analogous results exist for unconstrained and equality-constrained problems, we make progress on the major open question of convergence to second-order stationary points in the case of inequality constraints, without relying on NP-oracles or altering the definitions to account only for certain constraints. Our results hold for both regular and stochastic gradient descent.
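To make the setting concrete, here is a minimal illustrative sketch (not the paper's algorithm) of the general idea: projected gradient descent with injected noise on a function with a strict saddle point, under simple box (inequality) constraints. The objective f(x, y) = x² − y² and all parameters below are chosen purely for illustration.

```python
import numpy as np

def noisy_projected_gd(grad, x0, lo, hi, lr=0.1, noise=1e-2, steps=500, seed=0):
    """Projected gradient descent with Gaussian perturbations.

    The noise lets iterates drift off strict saddle points, where the
    exact gradient vanishes; np.clip projects onto the box [lo, hi]^d.
    """
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        g = grad(x) + noise * rng.standard_normal(x.shape)
        x = np.clip(x - lr * g, lo, hi)  # gradient step + projection
    return x

# f(x, y) = x^2 - y^2 has a strict saddle at the origin:
# the exact gradient there is zero, so noiseless GD started at
# (0, 0) would stay stuck forever.
grad = lambda v: np.array([2 * v[0], -2 * v[1]])
x = noisy_projected_gd(grad, [0.0, 0.0], lo=-1.0, hi=1.0)
# The perturbations seed the unstable (y) direction, which then grows
# until the iterate hits the box boundary, while x stays near 0.
```

The constrained case is subtle precisely because a projected iterate can sit at a boundary where the escape direction is infeasible; the paper's contribution is handling a logarithmic number of such inequality constraints with provable guarantees.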
Cite
Text
Avdiukhin and Yaroslavtsev. "HOUDINI: Escaping from Moderately Constrained Saddles." International Joint Conference on Artificial Intelligence, 2023. doi:10.24963/IJCAI.2023/383
Markdown
[Avdiukhin and Yaroslavtsev. "HOUDINI: Escaping from Moderately Constrained Saddles." International Joint Conference on Artificial Intelligence, 2023.](https://mlanthology.org/ijcai/2023/avdiukhin2023ijcai-houdini/) doi:10.24963/IJCAI.2023/383
BibTeX
@inproceedings{avdiukhin2023ijcai-houdini,
title = {{HOUDINI: Escaping from Moderately Constrained Saddles}},
author = {Avdiukhin, Dmitrii and Yaroslavtsev, Grigory},
booktitle = {International Joint Conference on Artificial Intelligence},
year = {2023},
pages = {3442--3450},
doi = {10.24963/IJCAI.2023/383},
url = {https://mlanthology.org/ijcai/2023/avdiukhin2023ijcai-houdini/}
}