Rethinking Neural Operations for Diverse Tasks

Abstract

An important goal of AutoML is to automate away the design of neural networks on new tasks in under-explored domains. Motivated by this goal, we study the problem of enabling users to discover the right neural operations given data from their specific domain. We introduce a search space of operations called XD-operations that mimic the inductive bias of standard multi-channel convolutions while being much more expressive: we prove that it includes many named operations across multiple application areas. Starting with any standard backbone such as ResNet, we show how to transform it into a search space over XD-operations and how to traverse the space using a simple weight-sharing scheme. On a diverse set of tasks—solving PDEs, distance prediction for protein folding, and music modeling—our approach consistently yields models with lower error than baseline networks and often even lower error than expert-designed domain-specific approaches.
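
To make the "mimics the inductive bias of convolutions" claim concrete: a circular convolution is diagonalized by the discrete Fourier transform, y = F⁻¹ diag(F w) F x, so a search space that relaxes the fixed DFT matrices to learnable structured matrices contains convolution as one point while being strictly more expressive. The snippet below is a minimal NumPy sketch of this fixed-DFT identity only (the convolution theorem); it is an illustration of the underlying linear-algebra fact, not the authors' implementation, and all variable names are illustrative.

# Minimal sketch (not the authors' code): checking that circular
# convolution equals the spectral form F^{-1} diag(F w) F x, the
# fixed-DFT special case that XD-operations generalize by making
# the three transform matrices learnable (an assumption-labeled
# reading of the construction, for illustration only).
import numpy as np

rng = np.random.default_rng(0)
n = 8
x = rng.standard_normal(n)   # input signal
w = rng.standard_normal(n)   # filter, length n (conceptually zero-padded)

# Direct circular convolution: y[i] = sum_j w[j] * x[(i - j) mod n]
y_direct = np.array([sum(w[j] * x[(i - j) % n] for j in range(n))
                     for i in range(n)])

# Spectral form: inverse DFT of the elementwise product of DFTs.
y_spectral = np.fft.ifft(np.fft.fft(w) * np.fft.fft(x)).real

assert np.allclose(y_direct, y_spectral)
print("circular conv == F^-1 diag(F w) F x:", np.allclose(y_direct, y_spectral))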

Cite

Text

Roberts et al. "Rethinking Neural Operations for Diverse Tasks." Neural Information Processing Systems, 2021.

Markdown

[Roberts et al. "Rethinking Neural Operations for Diverse Tasks." Neural Information Processing Systems, 2021.](https://mlanthology.org/neurips/2021/roberts2021neurips-rethinking/)

BibTeX

@inproceedings{roberts2021neurips-rethinking,
  title     = {{Rethinking Neural Operations for Diverse Tasks}},
  author    = {Roberts, Nicholas and Khodak, Mikhail and Dao, Tri and Li, Liam and Ré, Christopher and Talwalkar, Ameet},
  booktitle = {Neural Information Processing Systems},
  year      = {2021},
  url       = {https://mlanthology.org/neurips/2021/roberts2021neurips-rethinking/}
}