Name Your Style: Text-Guided Artistic Style Transfer
Abstract
Image style transfer has attracted widespread attention in recent years. Despite its remarkable results, it requires additional style images as references, which makes it less flexible and less convenient. Text is the most natural way to describe a style: it can capture implicit, abstract styles, such as those of specific artists or art movements. In this work, we propose a text-driven style transfer (TxST) method that leverages advanced image-text encoders to control arbitrary style transfer. We introduce a contrastive training strategy to effectively extract style descriptions from the image-text model (i.e., CLIP), which aligns the stylization with the text description. To this end, we also propose a novel cross-attention module to fuse style and content features. Finally, we achieve arbitrary artist-aware style transfer, learning and transferring specific artistic characteristics such as those of Picasso, oil painting, or a rough sketch. Extensive experiments demonstrate that our approach outperforms state-of-the-art methods. Moreover, it can mimic the styles of one or many artists to achieve attractive results, thus highlighting a promising future direction.
Cite
Text
Liu et al. "Name Your Style: Text-Guided Artistic Style Transfer." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2023. doi:10.1109/CVPRW59228.2023.00359
Markdown
[Liu et al. "Name Your Style: Text-Guided Artistic Style Transfer." IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2023.](https://mlanthology.org/cvprw/2023/liu2023cvprw-name/) doi:10.1109/CVPRW59228.2023.00359
BibTeX
@inproceedings{liu2023cvprw-name,
title = {{Name Your Style: Text-Guided Artistic Style Transfer}},
author = {Liu, Zhi-Song and Wang, Li-Wen and Siu, Wan-Chi and Kalogeiton, Vicky},
booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops},
year = {2023},
pages = {3530-3534},
doi = {10.1109/CVPRW59228.2023.00359},
url = {https://mlanthology.org/cvprw/2023/liu2023cvprw-name/}
}