Feature-Wise Transformations

Abstract

Distill articles are interactive publications and do not include traditional abstracts. This summary was written for the ML Anthology. The article surveys feature-wise linear modulation (FiLM) and related conditioning mechanisms that modulate neural network features through element-wise scaling and shifting, and demonstrates their effectiveness across visual reasoning, style transfer, language modeling, and reinforcement learning.
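The core operation summarized above, scaling and shifting each feature channel by conditioning-derived parameters, can be sketched in a few lines. This is a minimal illustrative NumPy version, not the article's implementation; the function name and tensor layout are assumptions.

```python
import numpy as np

def film(features, gamma, beta):
    """Feature-wise linear modulation (sketch): scale and shift each
    feature channel by per-channel parameters gamma and beta, which in
    practice are produced from some conditioning input.
    features: (batch, channels, height, width)
    gamma, beta: (batch, channels), broadcast over spatial dimensions.
    """
    return gamma[:, :, None, None] * features + beta[:, :, None, None]

# Tiny demo: one example, two channels, 2x2 spatial grid.
rng = np.random.default_rng(0)
x = rng.standard_normal((1, 2, 2, 2))
gamma = np.array([[2.0, 0.5]])  # per-channel scale
beta = np.array([[1.0, -1.0]])  # per-channel shift
y = film(x, gamma, beta)
```

In a full model, `gamma` and `beta` would typically come from a small network applied to the conditioning signal (a question, style vector, etc.), while `features` come from the main network being modulated.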

Cite

Text

Dumoulin et al. "Feature-Wise Transformations." Distill, 2018. doi:10.23915/distill.00011

Markdown

[Dumoulin et al. "Feature-Wise Transformations." Distill, 2018.](https://mlanthology.org/distill/2018/dumoulin2018distill-featurewise/) doi:10.23915/distill.00011

BibTeX

@article{dumoulin2018distill-featurewise,
  title     = {{Feature-Wise Transformations}},
  author    = {Dumoulin, Vincent and Perez, Ethan and Schucher, Nathan and Strub, Florian and de Vries, Harm and Courville, Aaron and Bengio, Yoshua},
  journal   = {Distill},
  year      = {2018},
  doi       = {10.23915/distill.00011},
  url       = {https://mlanthology.org/distill/2018/dumoulin2018distill-featurewise/}
}