Statistical Inference for Linear Stochastic Approximation with Markovian Noise

Abstract

In this paper we derive non-asymptotic Berry–Esseen bounds for Polyak–Ruppert averaged iterates of the Linear Stochastic Approximation (LSA) algorithm driven by Markovian noise. Our analysis yields $\mathcal{O}(n^{-1/4})$ convergence rates to the Gaussian limit in the Kolmogorov distance. We further establish the non-asymptotic validity of a multiplier block bootstrap procedure for constructing confidence intervals, guaranteeing consistent inference under Markovian sampling. Our work provides the first non-asymptotic guarantees on the rate of convergence of bootstrap-based confidence intervals for stochastic approximation with Markov noise. Moreover, we recover the classical rate of order $\mathcal{O}(n^{-1/8})$, up to logarithmic factors, for estimating the asymptotic variance of the iterates of the LSA algorithm.
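To make the setting concrete, the following is a minimal illustrative Python sketch (not the authors' code) of Polyak–Ruppert averaged LSA driven by a toy two-state Markov chain, followed by a multiplier block bootstrap of the averaged iterate. The step size, block length, number of bootstrap replications, and the state-dependent pairs A(z), b(z) are arbitrary assumptions chosen only for demonstration, and the bootstrap weights here are standard Gaussian multipliers applied to block means.

import numpy as np

# Illustrative sketch: LSA iterates
#   theta_{k+1} = theta_k - gamma * (A(Z_{k+1}) theta_k - b(Z_{k+1})),
# with Z_k a two-state Markov chain, followed by Polyak-Ruppert averaging
# and a multiplier block bootstrap for confidence intervals.
# All constants below are assumptions made for this example.

rng = np.random.default_rng(0)
d, n, gamma, block_len = 2, 20_000, 0.05, 100

# Toy Markov chain and state-dependent (A, b) pairs (hypothetical choices).
P = np.array([[0.9, 0.1], [0.2, 0.8]])
A_states = [np.eye(d) * 1.0, np.eye(d) * 0.5]
b_states = [np.ones(d), -np.ones(d)]

theta = np.zeros(d)
iterates = np.empty((n, d))
z = 0
for k in range(n):
    z = rng.choice(2, p=P[z])                        # Markovian noise
    theta = theta - gamma * (A_states[z] @ theta - b_states[z])
    iterates[k] = theta

theta_bar = iterates.mean(axis=0)                    # Polyak-Ruppert average

# Multiplier block bootstrap: perturb each block mean's deviation from the
# overall average with an independent N(0, 1) weight and re-average.
n_boot = 500
blocks = iterates[: (n // block_len) * block_len].reshape(-1, block_len, d)
block_means = blocks.mean(axis=1)
boot_stats = np.empty((n_boot, d))
for b in range(n_boot):
    w = rng.standard_normal(len(block_means))[:, None]
    boot_stats[b] = theta_bar + (w * (block_means - theta_bar)).mean(axis=0)

# Per-coordinate 95% bootstrap confidence intervals around theta_bar.
ci_low, ci_high = np.quantile(boot_stats, [0.025, 0.975], axis=0)
print("theta_bar:", theta_bar)
print("95% CI per coordinate:", np.stack([ci_low, ci_high], axis=1))

Blocking the iterates before applying the multiplier weights is one simple way to account for the serial dependence induced by the Markov chain; the paper's actual procedure and its non-asymptotic guarantees should be taken from the text itself.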

Cite

Text

Samsonov et al. "Statistical Inference for Linear Stochastic Approximation with Markovian Noise." Advances in Neural Information Processing Systems, 2025.

Markdown

[Samsonov et al. "Statistical Inference for Linear Stochastic Approximation with Markovian Noise." Advances in Neural Information Processing Systems, 2025.](https://mlanthology.org/neurips/2025/samsonov2025neurips-statistical/)

BibTeX

@inproceedings{samsonov2025neurips-statistical,
  title     = {{Statistical Inference for Linear Stochastic Approximation with Markovian Noise}},
  author    = {Samsonov, Sergey and Sheshukova, Marina and Moulines, Eric and Naumov, Alexey},
  booktitle = {Advances in Neural Information Processing Systems},
  year      = {2025},
  url       = {https://mlanthology.org/neurips/2025/samsonov2025neurips-statistical/}
}