Implicit Diffusion: Efficient Optimization Through Stochastic Sampling

Abstract

Sampling and automatic differentiation are both ubiquitous in modern machine learning. At their intersection lies the problem of differentiating through a sampling operation with respect to the parameters of the sampling process, which is both challenging and broadly applicable. We introduce a general framework and a new algorithm for first-order optimization of parameterized stochastic diffusions, jointly performing optimization and sampling steps in a single loop. This approach is inspired by recent advances in bilevel optimization and automatic implicit differentiation, leveraging the point of view of sampling as optimization over the space of probability distributions. We provide theoretical and experimental results showcasing the performance of our method.
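The single-loop idea described in the abstract can be sketched as follows. This is a purely illustrative toy, not the authors' algorithm: the Gaussian target `pi_theta`, the quadratic outer objective, the step sizes, and the gradient estimator are all assumptions chosen so the example stays self-contained. Each iteration takes one unadjusted Langevin step on a particle cloud (the sampling step) and one stochastic gradient step on the parameter (the optimization step), instead of running the sampler to convergence inside an outer loop.

```python
import numpy as np

# Illustrative single-loop sampling/optimization sketch (toy setup, not the
# paper's method). Target: pi_theta = N(theta, 1), i.e. V_theta(x) = (x - theta)^2 / 2.
# Outer objective: F(theta) = E_{x ~ pi_theta}[(x - 1)^2], minimized at theta = 1.
rng = np.random.default_rng(0)
theta = -2.0            # parameter of the sampling process
step_x = 0.1            # inner (Langevin) step size
step_theta = 0.05       # outer (optimization) step size
particles = rng.normal(size=1000)

for _ in range(500):
    # Sampling step: one unadjusted Langevin update targeting pi_theta.
    grad_V = particles - theta                     # grad_x V_theta(x)
    noise = rng.normal(size=particles.shape)
    particles = particles - step_x * grad_V + np.sqrt(2.0 * step_x) * noise

    # Optimization step: stochastic gradient of F estimated on the current cloud.
    # For this Gaussian family, dF/dtheta = 2 * E[x - 1] since E[x] = theta.
    grad_theta = 2.0 * np.mean(particles - 1.0)
    theta = theta - step_theta * grad_theta

print(theta)  # should approach the minimizer theta = 1.0
```

The key point the sketch illustrates is that the inner sampler never needs to equilibrate before the outer update: both dynamics run concurrently, one step each per iteration, which is what makes the single-loop scheme efficient.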

Cite

Text

Marion et al. "Implicit Diffusion: Efficient Optimization Through Stochastic Sampling." Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, 2025.

Markdown

[Marion et al. "Implicit Diffusion: Efficient Optimization Through Stochastic Sampling." Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, 2025.](https://mlanthology.org/aistats/2025/marion2025aistats-implicit/)

BibTeX

@inproceedings{marion2025aistats-implicit,
  title     = {{Implicit Diffusion: Efficient Optimization Through Stochastic Sampling}},
  author    = {Marion, Pierre and Korba, Anna and Bartlett, Peter and Blondel, Mathieu and De Bortoli, Valentin and Doucet, Arnaud and Llinares-López, Felipe and Paquette, Courtney and Berthet, Quentin},
  booktitle = {Proceedings of The 28th International Conference on Artificial Intelligence and Statistics},
  year      = {2025},
  pages     = {1999--2007},
  volume    = {258},
  url       = {https://mlanthology.org/aistats/2025/marion2025aistats-implicit/}
}