Computing the Bias of Constant-Step Stochastic Approximation with Markovian Noise
Abstract
We study stochastic approximation algorithms with Markovian noise and constant step-size $\alpha$. We develop a method based on infinitesimal generator comparisons to study the bias of the algorithm, that is, the expected difference between $\theta_n$ (the value at iteration $n$) and $\theta^*$ (the unique equilibrium of the corresponding ODE). We show that, under some smoothness conditions, this bias is of order $O(\alpha)$. Furthermore, we show that the time-averaged bias equals $\alpha V + O(\alpha^2)$, where $V$ is a constant characterized by a Lyapunov equation, so that $E[\bar{\theta}_n] \approx \theta^* + \alpha V$, where $\bar{\theta}_n$ is the Polyak-Ruppert average. We also show that $\bar{\theta}_n$ concentrates with high probability around $\theta^* + \alpha V$. We illustrate how to combine this with Richardson-Romberg extrapolation to derive an iterative scheme with a bias of order $O(\alpha^2)$.
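The following is a minimal, self-contained sketch (not code from the paper) of the scheme the abstract describes: constant step-size stochastic approximation driven by a Markov chain, Polyak-Ruppert averaging, and a Richardson-Romberg combination of two runs with step sizes $\alpha$ and $2\alpha$ that cancels the first-order bias term $\alpha V$. The two-state chain, the linear drift $f(\theta, x) = c(x) - a(x)\theta$, and all numerical values are illustrative assumptions chosen so that $\theta^*$ is easy to read off.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two-state Markov chain on {0, 1}; the transition matrix is an assumption.
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])

# State-dependent coefficients (also assumptions): f(theta, x) = c(x) - a(x) * theta.
# The averaged ODE is dtheta/dt = E_pi[c] - E_pi[a] * theta, with unique
# equilibrium theta* = E_pi[c] / E_pi[a].
a = np.array([1.0, 3.0])
c = np.array([1.0, -0.5])

def run_sa(alpha, n_iter, theta0=0.0):
    """Constant step-size SA with Polyak-Ruppert averaging over the second half."""
    theta, x = theta0, 0
    avg, count = 0.0, 0
    for n in range(n_iter):
        theta += alpha * (c[x] - a[x] * theta)  # theta_{n+1} = theta_n + alpha * f(theta_n, X_n)
        x = rng.choice(2, p=P[x])               # advance the Markovian noise
        if n >= n_iter // 2:                    # average only after a burn-in period
            count += 1
            avg += (theta - avg) / count        # running Polyak-Ruppert average
    return avg

alpha, n_iter = 0.05, 200_000
avg_a  = run_sa(alpha, n_iter)       # roughly theta* + alpha * V
avg_2a = run_sa(2 * alpha, n_iter)   # roughly theta* + 2 * alpha * V

# Richardson-Romberg extrapolation: the alpha * V terms cancel, leaving O(alpha^2).
theta_rr = 2 * avg_a - avg_2a
print(f"avg(alpha)={avg_a:.4f}  avg(2*alpha)={avg_2a:.4f}  extrapolated={theta_rr:.4f}")
```

Under the paper's result, the two averaged estimates carry biases of roughly $\alpha V$ and $2\alpha V$, so their extrapolated combination removes the first-order term without ever computing $V$ explicitly; this sketch only illustrates that mechanism on a toy problem.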
Cite

Allmeier and Gast. "Computing the Bias of Constant-Step Stochastic Approximation with Markovian Noise." Neural Information Processing Systems, 2024. doi:10.52202/079017-4379

BibTeX
@inproceedings{allmeier2024neurips-computing,
title = {{Computing the Bias of Constant-Step Stochastic Approximation with Markovian Noise}},
author = {Allmeier, Sebastian and Gast, Nicolas},
booktitle = {Neural Information Processing Systems},
year = {2024},
doi = {10.52202/079017-4379},
url = {https://mlanthology.org/neurips/2024/allmeier2024neurips-computing/}
}