StyLitGAN: Image-Based Relighting via Latent Control

Abstract

We describe StyLitGAN, a method for relighting and resurfacing images in the absence of labeled data. StyLitGAN generates images with realistic lighting effects, including cast shadows, soft shadows, inter-reflections, and glossy effects, without requiring paired or CGI data. StyLitGAN uses an intrinsic image method to decompose an image, then searches the latent space of a pretrained StyleGAN to identify a set of directions. By prompting the model to fix one component (e.g., albedo) and vary another (e.g., shading), we generate relighted images by adding the identified directions to the latent style codes. Quantitative metrics of change in albedo and of lighting diversity allow us to choose effective directions using a forward selection process. Qualitative evaluation confirms the effectiveness of our method.
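The abstract describes a concrete mechanic: add a learned direction to a StyleGAN's style codes so that shading changes while albedo stays fixed, then greedily select the best directions. The sketch below illustrates that loop in PyTorch under stated assumptions: `G` (a pretrained StyleGAN generator mapping style codes to images) and `decompose` (an intrinsic-image method returning albedo and shading) are hypothetical interfaces, not the paper's released code, and the scoring rule is a simplified stand-in for the paper's albedo-change and lighting-diversity metrics.

```python
import torch

# Assumed (hypothetical) interfaces:
#   G(w):         style codes of shape (N, num_layers, 512) -> images (N, 3, H, W)
#   decompose(x): image -> (albedo, shading) tensors of the same spatial size


def relight(G, w, direction, strength=1.0):
    """Relight by adding a latent direction to the style codes."""
    return G(w + strength * direction)


def score_direction(G, decompose, w, direction):
    """Simplified score: shading should change a lot, albedo very little."""
    albedo_0, shading_0 = decompose(G(w))
    albedo_1, shading_1 = decompose(relight(G, w, direction))
    albedo_change = (albedo_1 - albedo_0).abs().mean()    # want small
    lighting_change = (shading_1 - shading_0).abs().mean()  # want large
    return lighting_change - albedo_change


def forward_select(G, decompose, w, candidates, k=8):
    """Greedy forward selection of k effective directions from a candidate pool."""
    remaining = list(range(len(candidates)))
    chosen = []
    for _ in range(k):
        best = max(remaining,
                   key=lambda i: score_direction(G, decompose, w, candidates[i]))
        chosen.append(candidates[best])
        remaining.remove(best)
    return chosen


# Example candidate pool: random small perturbations in style space
# (the paper instead learns the directions; this is purely illustrative).
# candidates = [0.1 * torch.randn(num_layers, 512) for _ in range(100)]
# directions = forward_select(G, decompose, w, candidates, k=8)
```

In the paper the candidate directions are optimized rather than sampled, and selection balances diversity across the chosen set; the greedy structure above is only meant to make the forward-selection idea concrete.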

Cite

Text

Bhattad et al. "StyLitGAN: Image-Based Relighting via Latent Control." Conference on Computer Vision and Pattern Recognition, 2024. doi:10.1109/CVPR52733.2024.00405

Markdown

[Bhattad et al. "StyLitGAN: Image-Based Relighting via Latent Control." Conference on Computer Vision and Pattern Recognition, 2024.](https://mlanthology.org/cvpr/2024/bhattad2024cvpr-stylitgan/) doi:10.1109/CVPR52733.2024.00405

BibTeX

@inproceedings{bhattad2024cvpr-stylitgan,
  title     = {{StyLitGAN: Image-Based Relighting via Latent Control}},
  author    = {Bhattad, Anand and Soole, James and Forsyth, D.A.},
  booktitle = {Conference on Computer Vision and Pattern Recognition},
  year      = {2024},
  pages     = {4231--4240},
  doi       = {10.1109/CVPR52733.2024.00405},
  url       = {https://mlanthology.org/cvpr/2024/bhattad2024cvpr-stylitgan/}
}