Ref-NPR: Reference-Based Non-Photorealistic Radiance Fields for Controllable Scene Stylization
Abstract
Current 3D scene stylization methods transfer textures and colors as styles using arbitrary style references, lacking meaningful semantic correspondences. We introduce Reference-Based Non-Photorealistic Radiance Fields (Ref-NPR) to address this limitation. This controllable method stylizes a 3D scene represented by radiance fields using a single stylized 2D view as a reference. We propose a ray registration process based on the stylized reference view to obtain pseudo-ray supervision in novel views. We then exploit semantic correspondences in the content images to fill occluded regions with perceptually similar styles, resulting in non-photorealistic and continuous novel-view sequences. Our experimental results demonstrate that Ref-NPR outperforms existing scene and video stylization methods in both visual quality and semantic correspondence. The code and data are publicly available on the project page at https://ref-npr.github.io.
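The abstract's ray registration step can be pictured as back-projecting each stylized reference pixel to a 3D point (using depth rendered from a pre-trained photorealistic radiance field) and re-projecting those points into novel views to serve as pseudo ground-truth colors. The sketch below illustrates only that geometric idea; all function names, the pinhole-camera setup, and the NaN-masking convention are illustrative assumptions, not the paper's actual API, and occlusion handling (which Ref-NPR addresses via semantic correspondences) is omitted.

```python
import numpy as np

def backproject(depth, K, c2w):
    """Lift every pixel of the reference view to a world-space point.

    depth: (H, W) depth map rendered from the photorealistic radiance field
    K:     (3, 3) camera intrinsics
    c2w:   (4, 4) camera-to-world pose of the reference view
    """
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    pix = np.stack([u, v, np.ones_like(u)], axis=-1).reshape(-1, 3).T  # 3 x HW
    cam = np.linalg.inv(K) @ pix * depth.reshape(1, -1)                # camera space
    world = c2w[:3, :3] @ cam + c2w[:3, 3:4]                           # world space
    return world.T                                                     # HW x 3

def register_pseudo_colors(points, ref_colors, K, w2c_novel, h, w):
    """Project registered 3D points into a novel view and keep the visible
    ones as pseudo supervision colors for that view's rays."""
    cam = w2c_novel[:3, :3] @ points.T + w2c_novel[:3, 3:4]
    z = cam[2]
    pix = (K @ cam)[:2] / np.clip(z, 1e-6, None)
    u = np.round(pix[0]).astype(int)
    v = np.round(pix[1]).astype(int)
    valid = (z > 0) & (u >= 0) & (u < w) & (v >= 0) & (v < h)
    pseudo = np.full((h, w, 3), np.nan)          # NaN marks "no supervision here"
    pseudo[v[valid], u[valid]] = ref_colors.reshape(-1, 3)[valid]
    return pseudo
```

In this reading, rays of a novel view that land on a non-NaN pixel would be supervised by the reference style color, while the remaining (e.g., occluded) regions would be stylized through the semantic-correspondence mechanism described in the abstract.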
Cite
Text
Zhang et al. "Ref-NPR: Reference-Based Non-Photorealistic Radiance Fields for Controllable Scene Stylization." Conference on Computer Vision and Pattern Recognition, 2023. doi:10.1109/CVPR52729.2023.00413
Markdown
[Zhang et al. "Ref-NPR: Reference-Based Non-Photorealistic Radiance Fields for Controllable Scene Stylization." Conference on Computer Vision and Pattern Recognition, 2023.](https://mlanthology.org/cvpr/2023/zhang2023cvpr-refnpr/) doi:10.1109/CVPR52729.2023.00413
BibTeX
@inproceedings{zhang2023cvpr-refnpr,
title = {{Ref-NPR: Reference-Based Non-Photorealistic Radiance Fields for Controllable Scene Stylization}},
author = {Zhang, Yuechen and He, Zexin and Xing, Jinbo and Yao, Xufeng and Jia, Jiaya},
booktitle = {Conference on Computer Vision and Pattern Recognition},
year = {2023},
  pages = {4242--4251},
doi = {10.1109/CVPR52729.2023.00413},
url = {https://mlanthology.org/cvpr/2023/zhang2023cvpr-refnpr/}
}