REFRAME: Reflective Surface Real-Time Rendering for Mobile Devices

Abstract

This work tackles the challenging task of achieving real-time novel view synthesis for reflective surfaces across various scenes. Existing real-time rendering methods, especially those based on meshes, often perform poorly when modeling surfaces with rich view-dependent appearance. Our key idea is to leverage meshes for rendering acceleration while incorporating a novel approach to parameterizing view-dependent information. We decompose the color into diffuse and specular components, and model the specular color in the reflected direction based on a neural environment map. Our experiments demonstrate that our method achieves reconstruction quality for highly reflective surfaces comparable to state-of-the-art offline methods, while efficiently enabling real-time rendering on edge devices such as smartphones. Our project page is at https://xdimlab.github.io/REFRAME/.
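To make the shading idea in the abstract concrete, here is a minimal PyTorch sketch of querying a specular color along the mirrored view direction and adding it to a diffuse color. The NeuralEnvMap MLP, its layer sizes, and the simple additive diffuse/specular combination are illustrative assumptions for this sketch, not the paper's exact architecture.

import torch
import torch.nn as nn

def reflect(view_dir: torch.Tensor, normal: torch.Tensor) -> torch.Tensor:
    """Reflect the unit view direction (pointing toward the camera) about the surface normal.

    r = 2 (n . v) n - v
    """
    return 2.0 * (normal * view_dir).sum(-1, keepdim=True) * normal - view_dir

class NeuralEnvMap(nn.Module):
    """Hypothetical stand-in for a neural environment map:
    a small MLP mapping a reflected direction to a specular color."""

    def __init__(self, hidden: int = 64):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(3, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 3), nn.Sigmoid(),
        )

    def forward(self, refl_dir: torch.Tensor) -> torch.Tensor:
        return self.mlp(refl_dir)

def shade(diffuse: torch.Tensor, view_dir: torch.Tensor,
          normal: torch.Tensor, env_map: NeuralEnvMap) -> torch.Tensor:
    """Combine a per-point diffuse color with a view-dependent specular
    color looked up from the environment map along the reflected direction."""
    specular = env_map(reflect(view_dir, normal))
    return (diffuse + specular).clamp(0.0, 1.0)

In this sketch the environment map is view-independent by construction: it depends only on the reflected direction, which is what lets highly reflective appearance be captured with a lightweight per-direction lookup rather than a full view-conditioned color field.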

Cite

Text

Ji et al. "REFRAME: Reflective Surface Real-Time Rendering for Mobile Devices." Proceedings of the European Conference on Computer Vision (ECCV), 2024. doi:10.1007/978-3-031-72995-9_14

Markdown

[Ji et al. "REFRAME: Reflective Surface Real-Time Rendering for Mobile Devices." Proceedings of the European Conference on Computer Vision (ECCV), 2024.](https://mlanthology.org/eccv/2024/ji2024eccv-reframe/) doi:10.1007/978-3-031-72995-9_14

BibTeX

@inproceedings{ji2024eccv-reframe,
  title     = {{REFRAME: Reflective Surface Real-Time Rendering for Mobile Devices}},
  author    = {Ji, Chaojie and Li, Yufeng and Liao, Yiyi},
  booktitle = {Proceedings of the European Conference on Computer Vision (ECCV)},
  year      = {2024},
  doi       = {10.1007/978-3-031-72995-9_14},
  url       = {https://mlanthology.org/eccv/2024/ji2024eccv-reframe/}
}