Attentive Conditional Channel-Recurrent Autoencoding for Attribute-Conditioned Face Synthesis

Abstract

Attribute-conditioned face synthesis has many potential use cases, such as aiding the identification of a suspect or a missing person. Building on a conditional variant of VAE-GAN, we augment the pathways connecting the latent space with a channel-recurrent architecture, in order to provide not only improved generation quality but also interpretable high-level features. In particular, to better achieve the latter, we further propose an attention mechanism over each attribute that indicates the specific latent subset responsible for its modulation. Thanks to the latent semantics formed via the channel recurrence, we envision a tool that takes the desired attributes as inputs and then performs a two-stage, general-to-specific generation of diverse and realistic faces. Lastly, we incorporate the progressive-growing training scheme into the inference, generation, and discriminator networks of our models to facilitate higher-resolution outputs. Evaluations are performed through both qualitative visual examination and quantitative metrics, namely inception scores, human preferences, and attribute classification accuracy.
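
As a rough illustration of the channel-recurrent latent pathway with per-attribute attention described above, the following is a minimal, hypothetical PyTorch sketch. The block sizes, layer choices, and module names are assumptions made for illustration only and do not reproduce the authors' implementation.

# Hypothetical sketch: an RNN runs over groups of encoder feature channels to
# produce latent blocks, and a learned attention distribution over those blocks
# (one per attribute vector) modulates which latent subset each attribute affects.
import torch
import torch.nn as nn

class ChannelRecurrentLatent(nn.Module):
    def __init__(self, feat_channels=256, block_size=32, z_block=16, n_attrs=40):
        super().__init__()
        self.block_size = block_size
        self.n_blocks = feat_channels // block_size
        # LSTM steps over channel blocks, emitting one latent block per step.
        self.rnn = nn.LSTM(input_size=block_size, hidden_size=z_block, batch_first=True)
        # Maps the attribute vector to attention weights over the latent blocks.
        self.attr_attn = nn.Linear(n_attrs, self.n_blocks)

    def forward(self, feats, attrs):
        # feats: (B, C) pooled encoder features; attrs: (B, n_attrs) binary attributes
        blocks = feats.view(feats.size(0), self.n_blocks, self.block_size)
        z_blocks, _ = self.rnn(blocks)                  # (B, n_blocks, z_block)
        attn = torch.softmax(self.attr_attn(attrs), 1)  # (B, n_blocks)
        z = z_blocks * attn.unsqueeze(-1)               # attribute-modulated latent blocks
        return z.flatten(1)                             # (B, n_blocks * z_block)

# Usage (shapes are illustrative):
# z = ChannelRecurrentLatent()(torch.randn(4, 256), torch.zeros(4, 40))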

Cite

Text

Shang and Sohn. "Attentive Conditional Channel-Recurrent Autoencoding for Attribute-Conditioned Face Synthesis." IEEE/CVF Winter Conference on Applications of Computer Vision, 2019. doi:10.1109/WACV.2019.00168

Markdown

[Shang and Sohn. "Attentive Conditional Channel-Recurrent Autoencoding for Attribute-Conditioned Face Synthesis." IEEE/CVF Winter Conference on Applications of Computer Vision, 2019.](https://mlanthology.org/wacv/2019/shang2019wacv-attentive/) doi:10.1109/WACV.2019.00168

BibTeX

@inproceedings{shang2019wacv-attentive,
  title     = {{Attentive Conditional Channel-Recurrent Autoencoding for Attribute-Conditioned Face Synthesis}},
  author    = {Shang, Wenling and Sohn, Kihyuk},
  booktitle = {IEEE/CVF Winter Conference on Applications of Computer Vision},
  year      = {2019},
  pages     = {1533--1542},
  doi       = {10.1109/WACV.2019.00168},
  url       = {https://mlanthology.org/wacv/2019/shang2019wacv-attentive/}
}