Statistical Inference Using SGD
Abstract
We present a novel method for frequentist statistical inference in M-estimation problems, based on stochastic gradient descent (SGD) with a fixed step size: we demonstrate that the average of such SGD sequences can be used for statistical inference, after proper scaling. An intuitive analysis using the Ornstein-Uhlenbeck process suggests that such averages are asymptotically normal. To show the merits of our scheme, we apply it to both synthetic and real data sets, and demonstrate that its accuracy is comparable to classical statistical methods, while requiring potentially far less computation.
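The core idea (averaged fixed-step SGD iterates concentrate around the true parameter, with approximately normal fluctuations) can be illustrated with a minimal sketch. This is not the paper's exact procedure; the least-squares problem, step size, burn-in, and the use of independent replicates to visualize the sampling spread are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic linear regression, a simple instance of M-estimation
# (assumed setup; the paper covers general smooth M-estimators).
n, d = 2000, 3
theta_star = np.array([1.0, -2.0, 0.5])
X = rng.normal(size=(n, d))
y = X @ theta_star + rng.normal(size=n)

def sgd_average(eta, T, burn_in=200):
    """Run fixed-step SGD on least squares and return the average
    of the post-burn-in iterates (hypothetical helper)."""
    theta = np.zeros(d)
    running_sum = np.zeros(d)
    for t in range(T):
        i = rng.integers(n)
        grad = (X[i] @ theta - y[i]) * X[i]  # stochastic gradient
        theta -= eta * grad
        if t >= burn_in:
            running_sum += theta
    return running_sum / (T - burn_in)

# Several independent averaged-SGD estimates; their spread reflects the
# (asymptotically normal) fluctuation the paper analyzes via the
# Ornstein-Uhlenbeck approximation.
R = 20
estimates = np.array([sgd_average(eta=0.05, T=2000) for _ in range(R)])
print("mean estimate:", estimates.mean(axis=0))  # near theta_star
print("spread:", estimates.std(axis=0))
```

Each averaged sequence lands close to the true parameter, and the replicate-to-replicate variation gives a sense of the estimator's sampling distribution, which the paper exploits for confidence intervals.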
Cite
Text
Li et al. "Statistical Inference Using SGD." AAAI Conference on Artificial Intelligence, 2018. doi:10.1609/AAAI.V32I1.11686

Markdown

[Li et al. "Statistical Inference Using SGD." AAAI Conference on Artificial Intelligence, 2018.](https://mlanthology.org/aaai/2018/li2018aaai-statistical/) doi:10.1609/AAAI.V32I1.11686

BibTeX
@inproceedings{li2018aaai-statistical,
title = {{Statistical Inference Using SGD}},
author = {Li, Tianyang and Liu, Liu and Kyrillidis, Anastasios and Caramanis, Constantine},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2018},
pages = {3571--3578},
doi = {10.1609/AAAI.V32I1.11686},
url = {https://mlanthology.org/aaai/2018/li2018aaai-statistical/}
}