Availability-Aware Sensor Fusion via Unified Canonical Space
Abstract
Sensor fusion of camera, LiDAR, and 4-dimensional (4D) Radar has brought significant performance improvements in autonomous driving. However, fundamental challenges remain: deeply coupled fusion methods assume continuous sensor availability, making them vulnerable to sensor degradation and failure, whereas sensor-wise cross-attention fusion methods struggle with computational cost and unified feature representation. This paper presents availability-aware sensor fusion (ASF), a novel method that employs unified canonical projection (UCP) to establish consistency across all sensor features for fusion, and cross-attention across sensors along patches (CASAP) to enhance the robustness of sensor fusion against sensor degradation and failure. As a result, the proposed ASF achieves superior object detection performance compared to existing state-of-the-art fusion methods under various weather and sensor degradation (or failure) conditions. Extensive experiments on the K-Radar dataset demonstrate that ASF achieves improvements of 9.7% in $AP_{BEV}$ (87.2%) and 20.1% in $AP_{3D}$ (73.6%) for object detection at IoU=0.5, while requiring low computational cost. All code is available at https://github.com/kaist-avelab/k-radar.
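To make the two components concrete, below is a minimal, hypothetical sketch of the fusion idea described in the abstract: per-sensor features are assumed to already be projected onto a shared canonical patch grid (the role of UCP), and a learnable query then attends over whichever sensors are available in each patch (the spirit of CASAP). The module name `PatchCrossSensorAttention`, the tensor shapes, and the boolean availability mask are illustrative assumptions, not the authors' implementation; see the repository linked above for the actual code.

```python
# Minimal sketch of availability-aware, patch-wise cross-sensor attention.
# Hypothetical shapes and names; not the paper's implementation.
import torch
import torch.nn as nn

class PatchCrossSensorAttention(nn.Module):
    """Fuses per-sensor features that share a canonical patch grid.

    For each patch, a learnable fusion query attends over the feature
    tokens of the available sensors, so a degraded or missing sensor is
    simply masked out instead of breaking the fusion.
    """
    def __init__(self, dim: int = 256, num_heads: int = 8):
        super().__init__()
        self.query = nn.Parameter(torch.randn(1, 1, dim))
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)

    def forward(self, feats: torch.Tensor, available: torch.Tensor) -> torch.Tensor:
        # feats:     (B, P, S, C) - batch, patches, sensors, channels
        # available: (B, S) boolean - True where the sensor is usable
        B, P, S, C = feats.shape
        tokens = feats.reshape(B * P, S, C)            # one token sequence per patch
        q = self.query.expand(B * P, 1, C)             # shared fusion query
        mask = ~available.repeat_interleave(P, dim=0)  # (B*P, S), True = ignore sensor
        fused, _ = self.attn(q, tokens, tokens, key_padding_mask=mask)
        return fused.reshape(B, P, C)                  # availability-aware fused feature

# Usage: camera + LiDAR + 4D Radar features on a shared 32x32 patch grid.
feats = torch.randn(2, 32 * 32, 3, 256)
available = torch.tensor([[True, True, True],
                          [True, False, True]])       # LiDAR unavailable in sample 2
fused = PatchCrossSensorAttention()(feats, available)
print(fused.shape)  # torch.Size([2, 1024, 256])
```

Masking unavailable sensors at the attention level, rather than zero-filling their features, lets the same weights handle any subset of sensors, which is what would make such a fusion availability-aware.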
Cite
Text
Paek and Kong. "Availability-Aware Sensor Fusion via Unified Canonical Space." Advances in Neural Information Processing Systems, 2025.
Markdown
[Paek and Kong. "Availability-Aware Sensor Fusion via Unified Canonical Space." Advances in Neural Information Processing Systems, 2025.](https://mlanthology.org/neurips/2025/paek2025neurips-availabilityaware/)
BibTeX
@inproceedings{paek2025neurips-availabilityaware,
  title     = {{Availability-Aware Sensor Fusion via Unified Canonical Space}},
  author    = {Paek, Dong-Hee and Kong, Seung-Hyun},
  booktitle = {Advances in Neural Information Processing Systems},
  year      = {2025},
  url       = {https://mlanthology.org/neurips/2025/paek2025neurips-availabilityaware/}
}