Texture Synthesis Using Convolutional Neural Networks

Abstract

Here we introduce a new model of natural textures based on the feature spaces of convolutional neural networks optimised for object recognition. Samples from the model are of high perceptual quality, demonstrating the generative power of neural networks trained in a purely discriminative fashion. Within the model, textures are represented by the correlations between feature maps in several layers of the network. We show that across layers the texture representations increasingly capture the statistical properties of natural images while making object information more and more explicit. The model provides a new tool to generate stimuli for neuroscience and might offer insights into the deep representations learned by convolutional neural networks.
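To make the texture representation described in the abstract concrete, here is a minimal sketch of computing the correlations between feature maps of a layer (the Gram matrix) and a per-layer matching loss. This is not the authors' reference implementation; the random arrays stand in for CNN activations, and the function names and normalisation constant are illustrative assumptions.

```python
import numpy as np

def gram_matrix(feature_maps):
    """Correlations between the feature maps of one layer.

    feature_maps: array of shape (channels, height, width), e.g. the
    activations of one convolutional layer for a single image.
    Returns a (channels, channels) matrix of inner products between
    the vectorised feature maps.
    """
    c, h, w = feature_maps.shape
    f = feature_maps.reshape(c, h * w)   # one row per feature map
    return f @ f.T                       # G[i, j] = sum_k F[i, k] * F[j, k]

def layer_loss(gram_synth, gram_target, num_features, map_size):
    """Squared difference between the Gram matrices of the synthesised
    image and the target texture for one layer (normalisation shown here
    is one common choice)."""
    norm = 4.0 * (num_features ** 2) * (map_size ** 2)
    return np.sum((gram_synth - gram_target) ** 2) / norm

# Hypothetical usage with random arrays standing in for CNN activations:
target_features = np.random.rand(64, 32, 32)   # placeholder conv-layer output
synth_features  = np.random.rand(64, 32, 32)

loss = layer_loss(gram_matrix(synth_features),
                  gram_matrix(target_features),
                  num_features=64, map_size=32 * 32)
```

In the paper, synthesis proceeds by gradient descent on a white-noise image so that its Gram matrices across several network layers come to match those of the target texture.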

Cite

Text

Gatys et al. "Texture Synthesis Using Convolutional Neural Networks." Neural Information Processing Systems, 2015.

Markdown

[Gatys et al. "Texture Synthesis Using Convolutional Neural Networks." Neural Information Processing Systems, 2015.](https://mlanthology.org/neurips/2015/gatys2015neurips-texture/)

BibTeX

@inproceedings{gatys2015neurips-texture,
  title     = {{Texture Synthesis Using Convolutional Neural Networks}},
  author    = {Gatys, Leon A. and Ecker, Alexander S. and Bethge, Matthias},
  booktitle = {Neural Information Processing Systems},
  year      = {2015},
  pages     = {262--270},
  url       = {https://mlanthology.org/neurips/2015/gatys2015neurips-texture/}
}