Completeness and Coherence Learning for Fast Arbitrary Style Transfer
Abstract
Style transfer methods put a premium on two objectives: (1) completeness, which encourages the encoding of a complete set of style patterns; and (2) coherence, which discourages the production of spurious artifacts not found in the input styles. While existing methods pursue the two objectives either partially or implicitly, we present the Completeness and Coherence Network (CCNet), which jointly learns completeness and coherence components and resolves their incompatibility, both in an explicit manner. Specifically, we develop an attention mechanism integrated with bi-directional softmax operations for explicit imposition of the two objectives and for their collaborative modelling. We also propose CCLoss as a quantitative measure for evaluating the quality of a stylized image in terms of completeness and coherence. Through empirical evaluation, we demonstrate that, compared with existing methods, our method strikes a better trade-off among computational cost, generalization ability, and stylization quality.
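The abstract's central mechanism is attention with bi-directional softmax: normalizing the content-style similarity matrix along both axes, so that each content location attends over style patterns (coherence direction) while each style pattern also distributes its mass over content locations (completeness direction). The sketch below illustrates this idea under stated assumptions; the function name, feature shapes, and the element-wise fusion of the two attention maps are illustrative choices, not the paper's exact formulation.

```python
import numpy as np

def softmax(x, axis):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def bidirectional_attention(content, style):
    """Illustrative bi-directional softmax attention (a sketch, not CCNet's
    exact operator).

    content: (Nc, D) flattened content feature vectors
    style:   (Ns, D) flattened style feature vectors
    Returns stylized content features of shape (Nc, D).
    """
    sim = content @ style.T  # (Nc, Ns) similarity scores
    # Softmax over style positions: each content location forms a
    # distribution over style patterns (coherence direction).
    a_cs = softmax(sim, axis=1)
    # Softmax over content positions: each style pattern spreads its mass
    # over content locations, encouraging complete use of the style
    # (completeness direction).
    a_sc = softmax(sim, axis=0)
    # Fuse the two maps element-wise and renormalize per content location;
    # this fusion rule is one plausible choice for combining the objectives.
    a = a_cs * a_sc
    a = a / a.sum(axis=1, keepdims=True)
    return a @ style
```

Because both softmax outputs are strictly positive, the fused map remains a valid attention distribution after renormalization.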
Cite
Text

Wu et al. "Completeness and Coherence Learning for Fast Arbitrary Style Transfer." Transactions on Machine Learning Research, 2022.

Markdown

[Wu et al. "Completeness and Coherence Learning for Fast Arbitrary Style Transfer." Transactions on Machine Learning Research, 2022.](https://mlanthology.org/tmlr/2022/wu2022tmlr-completeness/)

BibTeX
@article{wu2022tmlr-completeness,
  title = {{Completeness and Coherence Learning for Fast Arbitrary Style Transfer}},
  author = {Wu, Zhijie and Song, Chunjin and Chen, Guanxiong and Guo, Sheng and Huang, Weilin},
  journal = {Transactions on Machine Learning Research},
  year = {2022},
  url = {https://mlanthology.org/tmlr/2022/wu2022tmlr-completeness/}
}