Temporal FiLM: Capturing Long-Range Sequence Dependencies with Feature-Wise Modulations.

Abstract

Learning representations that accurately capture long-range dependencies in sequential inputs --- including text, audio, and genomic data --- is a key problem in deep learning. Feed-forward convolutional models capture only feature interactions within finite receptive fields, while recurrent architectures can be slow and difficult to train due to vanishing gradients. Here, we propose Temporal Feature-Wise Linear Modulation (TFiLM) --- a novel architectural component inspired by adaptive batch normalization and its extensions --- that uses a recurrent neural network to alter the activations of a convolutional model. This approach expands the receptive field of convolutional sequence models with minimal computational overhead. Empirically, we find that TFiLM significantly improves the learning speed and accuracy of feed-forward neural networks on a range of generative and discriminative learning tasks, including text classification and audio super-resolution.
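
For a concrete picture of the mechanism described above, the sketch below shows one plausible TFiLM-style layer in PyTorch: convolutional activations are split into temporal blocks, each block is max-pooled to a summary vector, an LSTM run over those summaries emits per-block, per-channel scale and shift parameters, and the activations are modulated accordingly. This is a minimal sketch assuming PyTorch; the class and parameter names (TFiLMSketch, block_size, the 2 * n_channels hidden size) are illustrative choices, not the authors' reference implementation.

import torch
import torch.nn as nn

class TFiLMSketch(nn.Module):
    """Illustrative TFiLM-style layer: an LSTM over pooled temporal blocks
    produces per-block, per-channel scale/shift parameters that modulate
    the activations of a 1-D convolutional layer."""

    def __init__(self, n_channels: int, block_size: int):
        super().__init__()
        self.block_size = block_size
        # Hidden size 2 * n_channels so each step yields a (gamma, beta) pair per channel.
        self.rnn = nn.LSTM(input_size=n_channels,
                           hidden_size=2 * n_channels,
                           batch_first=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, time); time is assumed divisible by block_size.
        b, c, t = x.shape
        n_blocks = t // self.block_size
        blocks = x.view(b, c, n_blocks, self.block_size)
        # Summarize each block by max-pooling over time: (batch, n_blocks, channels).
        pooled = blocks.max(dim=-1).values.permute(0, 2, 1)
        # The recurrence over block summaries carries long-range information
        # across the sequence at the cost of one LSTM step per block.
        params, _ = self.rnn(pooled)                  # (batch, n_blocks, 2 * channels)
        gamma, beta = params.chunk(2, dim=-1)         # each (batch, n_blocks, channels)
        gamma = gamma.permute(0, 2, 1).unsqueeze(-1)  # (batch, channels, n_blocks, 1)
        beta = beta.permute(0, 2, 1).unsqueeze(-1)
        # Feature-wise linear modulation of every position inside each block.
        return (gamma * blocks + beta).reshape(b, c, t)

For example, TFiLMSketch(64, 128) applied to an input of shape (8, 64, 1024) runs the LSTM over only 8 blocks per sequence, which is consistent with the abstract's claim that the expanded receptive field comes with minimal computational overhead.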

Cite

Text

Birnbaum et al. "Temporal FiLM: Capturing Long-Range Sequence Dependencies with Feature-Wise Modulations." Neural Information Processing Systems, 2019.

Markdown

[Birnbaum et al. "Temporal FiLM: Capturing Long-Range Sequence Dependencies with Feature-Wise Modulations." Neural Information Processing Systems, 2019.](https://mlanthology.org/neurips/2019/birnbaum2019neurips-temporal/)

BibTeX

@inproceedings{birnbaum2019neurips-temporal,
  title     = {{Temporal FiLM: Capturing Long-Range Sequence Dependencies with Feature-Wise Modulations.}},
  author    = {Birnbaum, Sawyer and Kuleshov, Volodymyr and Enam, Zayd and Koh, Pang Wei W and Ermon, Stefano},
  booktitle = {Neural Information Processing Systems},
  year      = {2019},
  pages     = {10287--10298},
  url       = {https://mlanthology.org/neurips/2019/birnbaum2019neurips-temporal/}
}