U2RLE: Uncertainty-Guided 2-Stage Room Layout Estimation
Abstract
While existing deep learning-based room layout estimation techniques demonstrate good overall accuracy [17], they are less effective for distant floor-wall boundaries. To tackle this problem, we propose a novel uncertainty-guided approach for layout boundary estimation, introducing a new two-stage CNN architecture termed U2RLE. The initial stage predicts both the floor-wall boundary and its uncertainty, and is followed by the refinement of boundaries with high positional uncertainty using a different, distance-aware loss. Finally, outputs from the two stages are merged to produce the room layout. Experiments using the ZInD [4] and Structured3D [25] datasets show that U2RLE improves over the current state of the art, handling both near and far walls better. In particular, U2RLE outperforms current state-of-the-art techniques for the most distant walls.
Cite
Text
Fayyazsanavi et al. "U2RLE: Uncertainty-Guided 2-Stage Room Layout Estimation." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2023. doi:10.1109/CVPRW59228.2023.00364
Markdown
[Fayyazsanavi et al. "U2RLE: Uncertainty-Guided 2-Stage Room Layout Estimation." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2023.](https://mlanthology.org/cvprw/2023/fayyazsanavi2023cvprw-u2rle/) doi:10.1109/CVPRW59228.2023.00364
BibTeX
@inproceedings{fayyazsanavi2023cvprw-u2rle,
title = {{U2RLE: Uncertainty-Guided 2-Stage Room Layout Estimation}},
author = {Fayyazsanavi, Pooya and Wan, Zhiqiang and Hutchcroft, Will and Boyadzhiev, Ivaylo and Li, Yuguang and Kosecka, Jana and Kang, Sing Bing},
booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops},
year = {2023},
pages = {3562--3570},
doi = {10.1109/CVPRW59228.2023.00364},
url = {https://mlanthology.org/cvprw/2023/fayyazsanavi2023cvprw-u2rle/}
}