Differentiable Particle Filtering via Entropy-Regularized Optimal Transport
Abstract
Particle Filtering (PF) methods are an established class of procedures for performing inference in non-linear state-space models. Resampling is a key ingredient of PF, necessary to obtain low-variance likelihood and state estimates. However, traditional resampling methods result in PF-based loss functions being non-differentiable with respect to model and PF parameters. In a variational inference context, resampling also yields high-variance gradient estimates of the PF-based evidence lower bound. By leveraging optimal transport ideas, we introduce a principled differentiable particle filter and provide convergence results. We demonstrate this novel method on a variety of applications.
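The core idea described above can be illustrated with a small sketch: instead of multinomial resampling (which is non-differentiable), one solves an entropy-regularized optimal transport problem between the weighted particle cloud and a uniformly weighted one, and maps each particle to a weighted average of the others ("ensemble transform" resampling). The sketch below is a minimal NumPy/SciPy illustration of this idea, not the authors' implementation; the function name `ot_resample`, the squared-distance cost, and the fixed iteration count are our assumptions.

```python
import numpy as np
from scipy.special import logsumexp


def ot_resample(particles, log_w, eps=0.5, n_iter=300):
    """Differentiable resampling via entropy-regularized OT (illustrative sketch).

    Transports the weighted empirical measure (weights exp(log_w)) onto a
    uniformly weighted one with Sinkhorn iterations, then returns the
    ensemble-transform particles N * P^T @ particles, where P is the
    regularized transport plan. All operations are smooth in `particles`
    and `log_w`, unlike multinomial resampling.
    """
    n = particles.shape[0]
    # Squared Euclidean cost between all particle pairs (an assumed choice).
    diff = particles[:, None, :] - particles[None, :, :]
    C = np.sum(diff ** 2, axis=-1)
    log_a = log_w - logsumexp(log_w)   # source marginal: normalized weights
    log_b = np.full(n, -np.log(n))     # target marginal: uniform 1/N
    f = np.zeros(n)
    g = np.zeros(n)
    # Log-domain Sinkhorn updates; ending on f makes the row sums of P
    # match the particle weights exactly.
    for _ in range(n_iter):
        g = eps * (log_b - logsumexp((f[:, None] - C) / eps, axis=0))
        f = eps * (log_a - logsumexp((g[None, :] - C) / eps, axis=1))
    P = np.exp((f[:, None] + g[None, :] - C) / eps)  # transport plan
    # Each output particle is a convex combination of the inputs.
    return n * P.T @ particles
```

Because the transport plan's rows sum to the normalized weights, the empirical mean of the resampled cloud equals the weighted mean of the input cloud, while the output carries uniform weights, which is the property resampling is meant to deliver.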
Cite
Text
Corenflos et al. "Differentiable Particle Filtering via Entropy-Regularized Optimal Transport." International Conference on Machine Learning, 2021.

Markdown
[Corenflos et al. "Differentiable Particle Filtering via Entropy-Regularized Optimal Transport." International Conference on Machine Learning, 2021.](https://mlanthology.org/icml/2021/corenflos2021icml-differentiable/)

BibTeX
@inproceedings{corenflos2021icml-differentiable,
title = {{Differentiable Particle Filtering via Entropy-Regularized Optimal Transport}},
author = {Corenflos, Adrien and Thornton, James and Deligiannidis, George and Doucet, Arnaud},
booktitle = {International Conference on Machine Learning},
year = {2021},
pages = {2100--2111},
volume = {139},
url = {https://mlanthology.org/icml/2021/corenflos2021icml-differentiable/}
}