Particle Filter Recurrent Neural Networks

Abstract

Recurrent neural networks (RNNs) have been extraordinarily successful for prediction with sequential data. To tackle highly variable and multi-modal real-world data, we introduce Particle Filter Recurrent Neural Networks (PF-RNNs), a new RNN family that explicitly models uncertainty in its internal structure: while an RNN relies on a long, deterministic latent state vector, a PF-RNN maintains a latent state distribution, approximated as a set of particles. For effective learning, we provide a fully differentiable particle filter algorithm that updates the PF-RNN latent state distribution according to Bayes' rule. Experiments demonstrate that the proposed PF-RNNs outperform the corresponding standard gated RNNs on a synthetic robot localization dataset and 10 real-world sequence prediction datasets, including text classification and stock price prediction.
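To make the idea concrete, the sketch below illustrates one PF-RNN-style latent update in plain NumPy: each of K particles undergoes a stochastic transition given the input, is reweighted by a learned observation likelihood (the Bayes-rule step), and the weights are renormalized in log space. This is a minimal illustrative sketch, not the paper's actual architecture; the weight matrices, noise model, and `pf_rnn_step` function are assumptions for exposition (the paper's gated variants and soft-resampling details are omitted).

```python
import numpy as np

rng = np.random.default_rng(0)

def pf_rnn_step(particles, log_weights, x, W_trans, W_obs, noise_std=0.1):
    """One illustrative PF-RNN latent update:
    1. stochastic transition of each particle given input x,
    2. reweighting by a learned observation log-likelihood (Bayes' rule),
    3. log-space weight normalization (log-sum-exp).
    All operations are differentiable w.r.t. the weights."""
    K, H = particles.shape
    # 1. Transition: shared weights applied to [particle, input], plus noise
    inp = np.concatenate([particles, np.tile(x, (K, 1))], axis=1)
    particles = np.tanh(inp @ W_trans) + noise_std * rng.standard_normal((K, H))
    # 2. Measurement: scalar log-likelihood per particle (learned, hypothetical form)
    log_lik = (particles @ W_obs).squeeze(-1)
    log_weights = log_weights + log_lik
    # 3. Normalize weights with a numerically stable log-sum-exp
    m = log_weights.max()
    log_weights = log_weights - (m + np.log(np.exp(log_weights - m).sum()))
    return particles, log_weights

# Toy dimensions: K particles, latent size H, input size D
K, H, D = 8, 4, 3
particles = rng.standard_normal((K, H))
log_w = np.full(K, -np.log(K))           # uniform initial belief
W_trans = rng.standard_normal((H + D, H)) * 0.1
W_obs = rng.standard_normal((H, 1)) * 0.1
x = rng.standard_normal(D)

particles, log_w = pf_rnn_step(particles, log_w, x, W_trans, W_obs)
# Prediction can use the weighted mean of the particle set
h_mean = np.exp(log_w) @ particles
```

In the paper this update is embedded inside gated cells (PF-LSTM / PF-GRU) and trained end-to-end; the sketch only conveys how a particle set plus differentiable reweighting replaces a single deterministic latent vector.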

Cite

Text

Ma et al. "Particle Filter Recurrent Neural Networks." AAAI Conference on Artificial Intelligence, 2020. doi:10.1609/AAAI.V34I04.5952

Markdown

[Ma et al. "Particle Filter Recurrent Neural Networks." AAAI Conference on Artificial Intelligence, 2020.](https://mlanthology.org/aaai/2020/ma2020aaai-particle/) doi:10.1609/AAAI.V34I04.5952

BibTeX

@inproceedings{ma2020aaai-particle,
  title     = {{Particle Filter Recurrent Neural Networks}},
  author    = {Ma, Xiao and Karkus, Péter and Hsu, David and Lee, Wee Sun},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2020},
  pages     = {5101--5108},
  doi       = {10.1609/AAAI.V34I04.5952},
  url       = {https://mlanthology.org/aaai/2020/ma2020aaai-particle/}
}