ODG: Occupancy Prediction Using Dual Gaussians
Abstract
Occupancy prediction infers fine-grained 3D geometry and semantics from camera images of the surrounding environment, making it a critical perception task for autonomous driving. Existing methods either adopt dense grids as the scene representation, which are difficult to scale to high resolution, or learn the entire scene with a single set of sparse queries, which is insufficient to handle the diverse characteristics of scene objects. In this paper, we present ODG, a hierarchical dual sparse Gaussian representation that effectively captures complex scene dynamics. Building on the observation that driving scenes can be universally decomposed into static and dynamic components, we define dual Gaussian queries to better model the diverse scene objects. We utilize a hierarchical Gaussian transformer to predict the occupied voxel centers and semantic classes along with the Gaussian parameters. Leveraging the real-time rendering capability of 3D Gaussian Splatting, we further impose rendering supervision with available depth and semantic map annotations, injecting pixel-level alignment to boost occupancy learning. Extensive experiments on the Occ3D-nuScenes and Occ3D-Waymo benchmarks demonstrate that our method sets new state-of-the-art results while maintaining low inference cost.
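To make the dual-query idea concrete, here is a minimal sketch of what maintaining two separate sets of sparse Gaussian queries (one for static scene content, one for dynamic objects) might look like. This is an illustration only: the parameterization follows standard 3D Gaussian Splatting (mean, scale, rotation, opacity, plus per-Gaussian semantic logits), and the query counts, class count, and initialization values are assumptions, not details from the paper.

```python
from dataclasses import dataclass

import numpy as np


@dataclass
class GaussianQueries:
    """One set of sparse 3D Gaussian queries (standard 3DGS parameterization)."""

    means: np.ndarray      # (N, 3) Gaussian centers in world coordinates
    scales: np.ndarray     # (N, 3) per-axis extents
    rotations: np.ndarray  # (N, 4) unit quaternions (w, x, y, z)
    opacities: np.ndarray  # (N,) occupancy-like opacity in [0, 1]
    semantics: np.ndarray  # (N, C) per-Gaussian semantic class logits


def init_queries(n: int, num_classes: int, rng: np.random.Generator) -> GaussianQueries:
    """Initialize a query set; values here are placeholder assumptions."""
    return GaussianQueries(
        means=rng.uniform(-50.0, 50.0, size=(n, 3)),     # scatter over the scene volume
        scales=np.full((n, 3), 0.5),                     # small isotropic extent
        rotations=np.tile([1.0, 0.0, 0.0, 0.0], (n, 1)), # identity rotation
        opacities=np.full(n, 0.1),                       # low initial opacity
        semantics=np.zeros((n, num_classes)),            # uniform (zero) logits
    )


rng = np.random.default_rng(0)
# Hypothetical split: more queries for the static background than for dynamic objects.
static_queries = init_queries(n=6400, num_classes=18, rng=rng)
dynamic_queries = init_queries(n=1600, num_classes=18, rng=rng)
```

In a full model, each query set would be refined by its own transformer branch before the Gaussians are decoded into occupied voxel centers and semantic classes.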
Cite
Text
Shi et al. "ODG: Occupancy Prediction Using Dual Gaussians." Advances in Neural Information Processing Systems, 2025.
Markdown
[Shi et al. "ODG: Occupancy Prediction Using Dual Gaussians." Advances in Neural Information Processing Systems, 2025.](https://mlanthology.org/neurips/2025/shi2025neurips-odg/)
BibTeX
@inproceedings{shi2025neurips-odg,
  title = {{ODG: Occupancy Prediction Using Dual Gaussians}},
  author = {Shi, Yunxiao and Zhu, Yinhao and Cai, Hong and Han, Shizhong and Jeong, Jisoo and Ansari, Amin and Porikli, Fatih},
  booktitle = {Advances in Neural Information Processing Systems},
  year = {2025},
  url = {https://mlanthology.org/neurips/2025/shi2025neurips-odg/}
}