On Schrödinger Bridge Matching and Expectation Maximization

Abstract

In this work, we analyze methods for solving the Schrödinger Bridge problem from the perspective of alternating KL divergence minimization. While existing methods such as Iterative Proportional or Markovian Fitting require exact updates, since each iteration optimizes the same argument of the KL divergence, we justify a joint optimization of a single KL divergence objective from the perspective of information geometry. As in the variational EM algorithm, this allows partial, stochastic gradient updates that decrease a unified objective. We highlight connections with related bridge-matching, flow-matching, and few-step generative modeling approaches, where various parameterizations of the coupling distributions are contextualized from the perspective of marginal-preserving inference.
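To make concrete what "exact updates" means in the classical scheme the abstract contrasts against, the sketch below implements Iterative Proportional Fitting in its discrete, static form (the Sinkhorn iteration): each half-step exactly solves a KL projection of the current coupling onto the set of couplings with one prescribed marginal. This is a minimal illustration of the alternating scheme, not the paper's joint-objective method; the function name and toy problem are hypothetical choices for this example.

import numpy as np

def ipf_discrete(K, mu, nu, n_iters=500):
    """Discrete Iterative Proportional Fitting (Sinkhorn iteration).

    Starting from a reference kernel K, alternately performs the exact
    KL projections
        pi <- argmin_{pi' with row marginal mu}    KL(pi' || pi)
        pi <- argmin_{pi' with column marginal nu} KL(pi' || pi)
    Each half-step re-optimizes the same (first) argument of the KL
    divergence, which is why classical IPF requires exact updates.
    """
    u = np.ones_like(mu)
    v = np.ones_like(nu)
    for _ in range(n_iters):
        u = mu / (K @ v)    # exact KL projection onto the row-marginal constraint
        v = nu / (K.T @ u)  # exact KL projection onto the column-marginal constraint
    return u[:, None] * K * v[None, :]

# Toy example: couple a uniform source to a skewed target on a 1D grid.
x = np.linspace(-1.0, 1.0, 50)
K = np.exp(-(x[:, None] - x[None, :]) ** 2 / 0.5)  # Gaussian reference kernel
mu = np.full(50, 1.0 / 50)
nu = np.exp(-(x - 0.5) ** 2 / 0.05)
nu /= nu.sum()
pi = ipf_discrete(K, mu, nu)
print(abs(pi.sum(1) - mu).max(), abs(pi.sum(0) - nu).max())  # marginal errors

By contrast, the variational-EM view described in the abstract replaces these exact alternating projections with partial, stochastic gradient steps on a single joint KL objective, so neither marginal projection needs to be solved to completion at each iteration.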

Cite

Text

Brekelmans and Neklyudov. "On Schrödinger Bridge Matching and Expectation Maximization." NeurIPS 2023 Workshops: OTML, 2023.

Markdown

[Brekelmans and Neklyudov. "On Schrödinger Bridge Matching and Expectation Maximization." NeurIPS 2023 Workshops: OTML, 2023.](https://mlanthology.org/neuripsw/2023/brekelmans2023neuripsw-schrodinger/)

BibTeX

@inproceedings{brekelmans2023neuripsw-schrodinger,
  title     = {{On Schrödinger Bridge Matching and Expectation Maximization}},
  author    = {Brekelmans, Rob and Neklyudov, Kirill},
  booktitle = {NeurIPS 2023 Workshops: OTML},
  year      = {2023},
  url       = {https://mlanthology.org/neuripsw/2023/brekelmans2023neuripsw-schrodinger/}
}