HyperNST: Hyper-Networks for Neural Style Transfer

Abstract

We present HyperNST, a neural style transfer (NST) technique for the artistic stylization of images, based on hyper-networks and the StyleGAN2 architecture. Our contribution is a novel method for inducing style transfer parameterized by a metric space pre-trained for style-based visual search (SBVS). We show for the first time that such a space may be used to drive NST, enabling the application and interpolation of styles from an SBVS system. The technical contribution is a hyper-network that predicts weight updates to a StyleGAN2 model pre-trained over a diverse gamut of artistic content (portraits), tailoring the style parameterization on a per-region basis using a semantic map of the facial regions. We show that HyperNST exceeds the state of the art in content preservation for stylized content while retaining good style transfer performance.
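The core mechanism in the abstract — a hyper-network that maps a style embedding to weight updates for a frozen, pre-trained generator layer — can be sketched minimally as follows. This is an illustrative toy in NumPy, not the paper's actual architecture: all dimensions, names, and the single-linear-map hyper-network are assumptions for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (hypothetical, not from the paper): a 64-d
# style embedding drives weight updates for one 32x32 generator layer.
STYLE_DIM, LAYER_IN, LAYER_OUT = 64, 32, 32

# Stand-in for one frozen layer of the pre-trained generator.
base_weights = rng.standard_normal((LAYER_OUT, LAYER_IN))

# Toy hyper-network: a single linear map from the style embedding to a
# flattened weight delta for the target layer (a real hyper-network
# would be a learned MLP predicting deltas for many layers).
hyper_W = rng.standard_normal((LAYER_OUT * LAYER_IN, STYLE_DIM)) * 0.01

def predict_delta(style_embedding):
    """Predict a style-conditioned weight update for the frozen layer."""
    flat = hyper_W @ style_embedding
    return flat.reshape(LAYER_OUT, LAYER_IN)

def stylized_layer(style_embedding):
    """Base weights plus the predicted delta: the generator is never
    fine-tuned directly; only the hyper-network's output changes."""
    return base_weights + predict_delta(style_embedding)

style = rng.standard_normal(STYLE_DIM)
W = stylized_layer(style)
print(W.shape)
```

Because the style embedding lives in a metric space, interpolating between two embeddings before calling `stylized_layer` interpolates the resulting weight updates, which is the property the abstract highlights for style interpolation.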

Cite

Text

Ruta et al. "HyperNST: Hyper-Networks for Neural Style Transfer." European Conference on Computer Vision Workshops, 2022. doi:10.1007/978-3-031-25056-9_14

Markdown

[Ruta et al. "HyperNST: Hyper-Networks for Neural Style Transfer." European Conference on Computer Vision Workshops, 2022.](https://mlanthology.org/eccvw/2022/ruta2022eccvw-hypernst/) doi:10.1007/978-3-031-25056-9_14

BibTeX

@inproceedings{ruta2022eccvw-hypernst,
  title     = {{HyperNST: Hyper-Networks for Neural Style Transfer}},
  author    = {Ruta, Dan and Gilbert, Andrew and Motiian, Saeid and Faieta, Baldo and Lin, Zhe and Collomosse, John P.},
  booktitle = {European Conference on Computer Vision Workshops},
  year      = {2022},
  pages     = {201--217},
  doi       = {10.1007/978-3-031-25056-9_14},
  url       = {https://mlanthology.org/eccvw/2022/ruta2022eccvw-hypernst/}
}