PRISM: Probabilistic Real-Time Inference in Spatial World Models
Abstract
We introduce PRISM, a method for real-time filtering in a probabilistic generative model of agent motion and visual perception. Previous approaches either lack uncertainty estimates for the map and agent state, do not run in real time, lack a dense scene representation, or do not model agent dynamics. Our solution reconciles all of these aspects. We start from a predefined state-space model which combines differentiable rendering and 6-DoF dynamics. Probabilistic inference in this model amounts to simultaneous localisation and mapping (SLAM) and is intractable. We use a series of approximations to Bayesian inference to arrive at probabilistic map and state estimates. We take advantage of well-established methods and closed-form updates, preserving accuracy and enabling real-time capability. The proposed solution runs in real time at 10 Hz and matches the accuracy of state-of-the-art SLAM in small to medium-sized indoor environments, with high-speed UAV and handheld camera agents (Blackbird, EuRoC and TUM-RGBD).
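The abstract's mention of closed-form Bayesian updates refers to the general recursive filtering pattern (predict with a dynamics model, update with a measurement model). Below is a minimal sketch of that pattern, not the PRISM algorithm itself: the linear dynamics, the position-only measurement, and all parameter values are hypothetical placeholders chosen for illustration.

```python
# Minimal sketch of recursive Gaussian filtering (predict/update), the general
# pattern behind real-time probabilistic state estimation. This is NOT the
# authors' PRISM model: dynamics, measurement model and parameters below are
# hypothetical stand-ins.
import numpy as np

def predict(mu, Sigma, F, Q):
    """Propagate a Gaussian state estimate through linear(ised) dynamics."""
    return F @ mu, F @ Sigma @ F.T + Q

def update(mu, Sigma, z, H, R):
    """Closed-form Gaussian update given a linear(ised) measurement model."""
    S = H @ Sigma @ H.T + R                # innovation covariance
    K = Sigma @ H.T @ np.linalg.inv(S)     # Kalman gain
    mu_new = mu + K @ (z - H @ mu)
    Sigma_new = (np.eye(len(mu)) - K @ H) @ Sigma
    return mu_new, Sigma_new

# Toy 1D constant-velocity example (a stand-in for 6-DoF pose dynamics).
dt = 0.1                                   # 10 Hz, the rate quoted in the abstract
F = np.array([[1.0, dt], [0.0, 1.0]])      # position/velocity transition
Q = 1e-3 * np.eye(2)                       # process noise
H = np.array([[1.0, 0.0]])                 # observe position only
R = np.array([[1e-2]])                     # measurement noise

mu, Sigma = np.zeros(2), np.eye(2)
for z in [np.array([0.11]), np.array([0.19]), np.array([0.32])]:
    mu, Sigma = predict(mu, Sigma, F, Q)
    mu, Sigma = update(mu, Sigma, z, H, R)
print(mu, np.diag(Sigma))                  # posterior mean and marginal variances
```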
Cite
Text
Mirchev et al. "PRISM: Probabilistic Real-Time Inference in Spatial World Models." Conference on Robot Learning, 2022.
Markdown
[Mirchev et al. "PRISM: Probabilistic Real-Time Inference in Spatial World Models." Conference on Robot Learning, 2022.](https://mlanthology.org/corl/2022/mirchev2022corl-prism/)
BibTeX
@inproceedings{mirchev2022corl-prism,
  title = {{PRISM: Probabilistic Real-Time Inference in Spatial World Models}},
  author = {Mirchev, Atanas and Kayalibay, Baris and Agha, Ahmed and van der Smagt, Patrick and Cremers, Daniel and Bayer, Justin},
  booktitle = {Conference on Robot Learning},
  year = {2022},
  pages = {161-174},
  volume = {205},
  url = {https://mlanthology.org/corl/2022/mirchev2022corl-prism/}
}