The Kernel Kalman Rule - Efficient Nonparametric Inference with Recursive Least Squares

Abstract

Nonparametric inference techniques provide promising tools for probabilistic reasoning in high-dimensional nonlinear systems. Most of these techniques embed distributions into reproducing kernel Hilbert spaces (RKHS) and rely on the kernel Bayes' rule (KBR) to manipulate the embeddings. However, the computational demands of the KBR scale poorly with the number of samples, and the KBR often suffers from numerical instabilities. In this paper, we present the kernel Kalman rule (KKR) as an alternative to the KBR. The derivation of the KKR is based on recursive least squares, inspired by the derivation of the Kalman innovation update. We apply the KKR to filtering tasks where we use RKHS embeddings to represent the belief state, resulting in the kernel Kalman filter (KKF). We show on a nonlinear state estimation task with high-dimensional observations that our approach provides significantly improved estimation accuracy while significantly decreasing the computational demands.
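
Illustrative Python sketch

The abstract describes a Kalman-style innovation update applied to RKHS belief embeddings. The sketch below is a minimal, hedged illustration of that idea, not the authors' exact formulation: the belief embedding is represented by a weight vector over kernel features of the training states, the transition is a simple kernel-ridge-regression estimate, and the correction step uses a Kalman-gain-like expression built from Gram matrices. The class and function names, the uniform prior weights, the identity-scaled prior covariance, the shared regularizer, and the change-of-basis approximation in the transition step are all simplifying assumptions for illustration.

import numpy as np


def rbf_kernel(A, B, bw=1.0):
    """Squared-exponential Gram matrix between the rows of A and B."""
    d2 = np.sum(A ** 2, axis=1)[:, None] + np.sum(B ** 2, axis=1)[None, :] - 2.0 * A @ B.T
    return np.exp(-d2 / (2.0 * bw ** 2))


class KernelKalmanFilterSketch:
    """Toy kernel-Kalman-style filter (illustrative assumption, not the paper's exact method):
    the belief embedding is a weight vector m over kernel features of the training states X,
    with an uncertainty estimate S over those weights."""

    def __init__(self, X, X_next, Y, bw=1.0, reg=1e-3):
        n = X.shape[0]
        self.X, self.Y, self.bw, self.reg = X, Y, bw, reg
        Kx = rbf_kernel(X, X, bw)
        # transition in weight space via kernel ridge regression on (x_i, x'_i) pairs
        # (glosses over the change of basis between current- and next-state features)
        self.T = np.linalg.solve(Kx + reg * n * np.eye(n), rbf_kernel(X, X_next, bw))
        self.Gy = rbf_kernel(Y, Y, bw)      # Gram matrix of the training observations
        self.m = np.full(n, 1.0 / n)        # uniform prior weights (assumption)
        self.S = np.eye(n) / n              # simplified prior covariance (assumption)

    def predict(self):
        # propagate weights and covariance through the learned transition operator
        self.m = self.T @ self.m
        self.S = self.T @ self.S @ self.T.T + self.reg * np.eye(len(self.m))

    def update(self, y):
        # Kalman-gain-style innovation update computed from Gram matrices
        n = len(self.m)
        gain = self.S @ self.Gy @ np.linalg.inv(self.Gy @ self.S @ self.Gy + self.reg * np.eye(n))
        gy = rbf_kernel(self.Y, np.atleast_2d(y), self.bw).ravel()  # features of the new observation
        self.m = self.m + gain @ (gy - self.Gy @ self.m)
        self.S = self.S - gain @ self.Gy @ self.S

    def mean_state(self):
        # decode a point estimate as the kernel-weighted average of training states
        return self.X.T @ self.m

Given paired trajectory samples (x_i, x'_i, y_i), one would alternate predict() and update(y_t) and read off mean_state() as the filtered point estimate; the paper derives the update from recursive least squares and evaluates it on high-dimensional observations.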

Cite

Text

Gebhardt et al. "The Kernel Kalman Rule - Efficient Nonparametric Inference with Recursive Least Squares." AAAI Conference on Artificial Intelligence, 2017. doi:10.1609/AAAI.V31I1.11051

Markdown

[Gebhardt et al. "The Kernel Kalman Rule - Efficient Nonparametric Inference with Recursive Least Squares." AAAI Conference on Artificial Intelligence, 2017.](https://mlanthology.org/aaai/2017/gebhardt2017aaai-kernel/) doi:10.1609/AAAI.V31I1.11051

BibTeX

@inproceedings{gebhardt2017aaai-kernel,
  title     = {{The Kernel Kalman Rule - Efficient Nonparametric Inference with Recursive Least Squares}},
  author    = {Gebhardt, Gregor H. W. and Kupcsik, Andras Gabor and Neumann, Gerhard},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2017},
  pages     = {3754-3760},
  doi       = {10.1609/AAAI.V31I1.11051},
  url       = {https://mlanthology.org/aaai/2017/gebhardt2017aaai-kernel/}
}