All-to-Key Attention for Arbitrary Style Transfer
Abstract
Attention-based arbitrary style transfer studies have shown promising performance in synthesizing vivid local style details. They typically use the all-to-all attention mechanism---each position of content features is fully matched to all positions of style features. However, all-to-all attention tends to generate distorted style patterns and has quadratic complexity, limiting the effectiveness and efficiency of arbitrary style transfer. In this paper, we propose a novel all-to-key attention mechanism---each position of content features is matched to stable key positions of style features---that is more in line with the characteristics of style transfer. Specifically, it integrates two newly proposed attention forms: distributed and progressive attention. Distributed attention assigns attention to key style representations that depict the style distribution of local regions; progressive attention pays attention from coarse-grained regions to fine-grained key positions. The resultant module, dubbed StyA2K, shows extraordinary performance in preserving the semantic structure and rendering consistent style patterns. Qualitative and quantitative comparisons with state-of-the-art methods demonstrate the superior performance of our approach. Code and models are available at https://github.com/LearningHx/StyA2K.
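The coarse-to-fine idea in the abstract can be illustrated with a toy sketch. The code below is a hypothetical, simplified interpretation (not the authors' StyA2K implementation): region summaries stand in for the "key style representations" of distributed attention, and restricting the fine match to the best-scoring region stands in for progressive attention. Function names, the mean-pooling choice, and the region partitioning are all illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def all_to_key_attention_sketch(content, style, num_regions=4):
    """Toy coarse-to-fine attention (illustrative only, not the paper's code).

    content: (Nc, d) content feature positions (queries).
    style:   (Ns, d) style feature positions; Ns must divide by num_regions.

    All-to-all attention would score every (content, style) pair: O(Nc * Ns).
    Here each content position first attends to region-level summaries
    ("distributed" step), then only to the positions inside its best-matching
    region ("progressive" step): O(Nc * (num_regions + Ns / num_regions)).
    """
    Nc, d = content.shape
    Ns = style.shape[0]
    r = Ns // num_regions
    regions = style.reshape(num_regions, r, d)

    # Distributed step: one pooled key representation per local style region.
    region_keys = regions.mean(axis=1)                       # (num_regions, d)
    coarse = softmax(content @ region_keys.T / np.sqrt(d))   # (Nc, num_regions)

    # Progressive step: refine within the best-matching region only.
    best = coarse.argmax(axis=1)                             # (Nc,)
    out = np.empty_like(content)
    for i in range(Nc):
        keys = regions[best[i]]                              # (r, d)
        fine = softmax(content[i] @ keys.T / np.sqrt(d))     # (r,)
        out[i] = fine @ keys                                 # (d,)
    return out
```

Because each output row is a convex combination of style positions drawn from a single region, stylized features stay consistent within local style patterns, which is the intuition the abstract gives for avoiding the distortions of all-to-all matching.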
Cite
Text
Zhu et al. "All-to-Key Attention for Arbitrary Style Transfer." International Conference on Computer Vision, 2023. doi:10.1109/ICCV51070.2023.02112
Markdown
[Zhu et al. "All-to-Key Attention for Arbitrary Style Transfer." International Conference on Computer Vision, 2023.](https://mlanthology.org/iccv/2023/zhu2023iccv-alltokey/) doi:10.1109/ICCV51070.2023.02112
BibTeX
@inproceedings{zhu2023iccv-alltokey,
title = {{All-to-Key Attention for Arbitrary Style Transfer}},
author = {Zhu, Mingrui and He, Xiao and Wang, Nannan and Wang, Xiaoyu and Gao, Xinbo},
booktitle = {International Conference on Computer Vision},
year = {2023},
pages = {23109-23119},
doi = {10.1109/ICCV51070.2023.02112},
url = {https://mlanthology.org/iccv/2023/zhu2023iccv-alltokey/}
}