IHT Dies Hard: Provable Accelerated Iterative Hard Thresholding
Abstract
We study, both in theory and practice, the use of momentum in classic iterative hard thresholding (IHT) methods. By simply modifying plain IHT, we investigate its convergence behavior on convex optimization criteria with non-convex constraints, under standard assumptions. In diverse scenarios, we observe that acceleration in IHT leads to significant improvements over state-of-the-art projected gradient descent and Frank-Wolfe variants. As a byproduct of our analysis, we study the impact of selecting the momentum parameter: as in convex settings, two modes of behavior are observed, "rippling" and linear, depending on the level of momentum.
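To make the idea concrete, below is a minimal sketch of momentum-accelerated IHT for the standard sparse least-squares problem, min ||y - Ax||^2 subject to ||x||_0 <= k. This is an illustration under assumed choices (least-squares objective, fixed step size and momentum value), not the paper's exact algorithm or parameter schedule; all function names here are hypothetical.

```python
import numpy as np

def hard_threshold(x, k):
    """Keep the k largest-magnitude entries of x; zero out the rest."""
    z = np.zeros_like(x)
    idx = np.argpartition(np.abs(x), -k)[-k:]
    z[idx] = x[idx]
    return z

def accelerated_iht(A, y, k, step=None, momentum=0.5, iters=200):
    """Illustrative momentum-accelerated IHT for min 0.5*||y - Ax||^2, ||x||_0 <= k."""
    n = A.shape[1]
    if step is None:
        # Conservative step size based on the spectral norm of A (assumption).
        step = 1.0 / np.linalg.norm(A, 2) ** 2
    x_prev = np.zeros(n)
    u = np.zeros(n)  # extrapolated point
    for _ in range(iters):
        grad = A.T @ (A @ u - y)                 # gradient of 0.5*||y - Au||^2
        x = hard_threshold(u - step * grad, k)   # gradient step + hard-thresholding projection
        u = x + momentum * (x - x_prev)          # momentum extrapolation
        x_prev = x
    return x_prev

# Tiny usage example on synthetic data.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 400))
x_true = np.zeros(400)
x_true[rng.choice(400, size=10, replace=False)] = rng.standard_normal(10)
y = A @ x_true
x_hat = accelerated_iht(A, y, k=10)
```

The `momentum=0.5` value is only a placeholder; as the abstract notes, the choice of momentum level governs whether the iterates exhibit "rippling" or linear convergence behavior.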
Cite
Text
Khanna and Kyrillidis. "IHT Dies Hard: Provable Accelerated Iterative Hard Thresholding." International Conference on Artificial Intelligence and Statistics, 2018.
Markdown
[Khanna and Kyrillidis. "IHT Dies Hard: Provable Accelerated Iterative Hard Thresholding." International Conference on Artificial Intelligence and Statistics, 2018.](https://mlanthology.org/aistats/2018/khanna2018aistats-iht/)
BibTeX
@inproceedings{khanna2018aistats-iht,
  title     = {{IHT Dies Hard: Provable Accelerated Iterative Hard Thresholding}},
  author    = {Khanna, Rajiv and Kyrillidis, Anastasios},
  booktitle = {International Conference on Artificial Intelligence and Statistics},
  year      = {2018},
  pages     = {188--198},
  url       = {https://mlanthology.org/aistats/2018/khanna2018aistats-iht/}
}