The Adaptive Complexity of Minimizing Relative Fisher Information

Abstract

Non-log-concave sampling from an unnormalized density is fundamental in machine learning and statistics. As datasets grow larger, computational efficiency becomes increasingly important, particularly in reducing adaptive complexity, namely the number of sequential rounds required by sampling algorithms. In this work, we initiate the study of the adaptive complexity of non-log-concave sampling within the framework of relative Fisher information introduced by Balasubramanian et al. in 2022. To achieve relative Fisher information at most $\varepsilon^2$ with respect to the target distribution, we propose a novel algorithm that reduces the adaptive complexity from $\mathcal{O}(d^2/\varepsilon^4)$ to $\mathcal{O}(d/\varepsilon^2)$ by leveraging parallelism. Furthermore, we show that our algorithm is optimal in a specific regime of large $\varepsilon$. Our algorithm builds on a diagonally parallelized Picard iteration, while the lower bound is based on a reduction from the problem of finding stationary points.
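To give intuition for why Picard iteration helps with adaptive complexity, the following is a minimal, hypothetical sketch (not the paper's algorithm): Picard iteration solves the integral form $x(t) = x(0) + \int_0^t f(x(s))\,ds$ by repeatedly refreshing the entire discretized trajectory. Within one sweep, all drift evaluations are independent and could run in parallel, so the sequential cost is the number of sweeps rather than the number of time steps. All function and parameter names below are illustrative assumptions.

```python
import numpy as np

def picard_parallel(f, x0, T=1.0, n_steps=200, n_sweeps=50):
    """Illustrative Picard iteration on a uniform grid (not the paper's method).

    Each sweep evaluates the drift f at every grid point at once
    (a parallelizable batch), then rebuilds the whole trajectory via a
    left-endpoint quadrature of the integral equation. The adaptive
    (sequential) depth is n_sweeps, independent of n_steps.
    """
    h = T / n_steps
    x = np.full(n_steps + 1, x0, dtype=float)  # initial guess: constant path
    for _ in range(n_sweeps):                  # sequential rounds
        drift = f(x)                           # n_steps+1 evaluations in one batch
        # cumulative left-endpoint quadrature: x_k = x0 + h * sum_{j<k} f(x_j)
        integral = np.concatenate(([0.0], np.cumsum(drift[:-1]) * h))
        x = x0 + integral
    return x

# Example: x' = -x with x(0) = 1, so x(T) should approach e^{-T}
traj = picard_parallel(lambda x: -x, 1.0, T=1.0)
```

At the fixed point this reproduces the explicit Euler solution, but reaches it in a number of sequential rounds governed by the contraction rate of the Picard map rather than by the grid resolution, which is the flavor of parallel speedup the abstract refers to.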

Cite

Text

Zhou and Sugiyama. "The Adaptive Complexity of Minimizing Relative Fisher Information." Advances in Neural Information Processing Systems, 2025.

Markdown

[Zhou and Sugiyama. "The Adaptive Complexity of Minimizing Relative Fisher Information." Advances in Neural Information Processing Systems, 2025.](https://mlanthology.org/neurips/2025/zhou2025neurips-adaptive/)

BibTeX

@inproceedings{zhou2025neurips-adaptive,
  title     = {{The Adaptive Complexity of Minimizing Relative Fisher Information}},
  author    = {Zhou, Huanjian and Sugiyama, Masashi},
  booktitle = {Advances in Neural Information Processing Systems},
  year      = {2025},
  url       = {https://mlanthology.org/neurips/2025/zhou2025neurips-adaptive/}
}