Robust Descent Using Smoothed Multiplicative Noise
Abstract
In this work, we propose a novel robust gradient descent procedure which makes use of smoothed multiplicative noise applied directly to the observations before constructing a sum of soft-truncated gradient coordinates. We show that the procedure has competitive theoretical guarantees, with the major advantage of a simple implementation that does not require an iterative sub-routine for robustification. Empirical tests reinforce the theory, showing more efficient generalization over a much wider class of data distributions.
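To make the abstract's description concrete, below is a minimal sketch (not the paper's exact estimator) of a robust gradient descent loop in which the usual empirical mean of per-sample gradients is replaced by a coordinate-wise sum of rescaled, soft-truncated gradients. The truncation function psi, the scaling parameter s, and the fact that the multiplicative-noise smoothing is abstracted into the fixed form of psi are all illustrative assumptions.

import numpy as np

def psi(x):
    """Soft truncation: roughly linear near zero, with bounded growth in the
    tails (a Catoni-style influence function; the exact form is an assumption)."""
    return np.sign(x) * np.log1p(np.abs(x) + 0.5 * x**2)

def robust_grad(per_sample_grads, s):
    """Aggregate an (n, d) array of per-sample gradients coordinate-wise.

    Each coordinate is rescaled by s, passed through psi, summed over the
    sample, and rescaled back -- a soft-truncated alternative to the mean."""
    n = per_sample_grads.shape[0]
    return (s / n) * psi(per_sample_grads / s).sum(axis=0)

def robust_gd(grad_fn, w0, data, steps=100, lr=0.1, s=1.0):
    """Plain gradient descent that plugs in the robust gradient estimate."""
    w = w0.copy()
    for _ in range(steps):
        g = robust_grad(grad_fn(w, data), s)  # (n, d) -> (d,)
        w -= lr * g
    return w

# Example: linear regression under heavy-tailed residuals (hypothetical setup).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + rng.standard_t(df=2, size=200)

def grad_fn(w, data):
    X, y = data
    residuals = X @ w - y          # (n,)
    return residuals[:, None] * X  # per-sample squared-loss gradients, (n, d)

w_hat = robust_gd(grad_fn, np.zeros(3), (X, y), s=5.0)
print(w_hat)

The design intent captured here is that a few heavy-tailed gradient coordinates cannot dominate an update, while well-behaved coordinates pass through nearly unchanged; no iterative robustification sub-routine is needed, since the truncation is a single closed-form transformation.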
Cite

Text
Holland. "Robust Descent Using Smoothed Multiplicative Noise." Artificial Intelligence and Statistics, 2019.

Markdown
[Holland. "Robust Descent Using Smoothed Multiplicative Noise." Artificial Intelligence and Statistics, 2019.](https://mlanthology.org/aistats/2019/holland2019aistats-robust/)

BibTeX
@inproceedings{holland2019aistats-robust,
title = {{Robust Descent Using Smoothed Multiplicative Noise}},
author = {Holland, Matthew J.},
booktitle = {Artificial Intelligence and Statistics},
year = {2019},
pages = {703-711},
volume = {89},
url = {https://mlanthology.org/aistats/2019/holland2019aistats-robust/}
}