Equivariance Through Parameter-Sharing
Abstract
We propose to study equivariance in deep neural networks through parameter symmetries. In particular, given a group G that acts discretely on the input and output of a standard neural network layer, we show that the layer's equivariance is linked to the symmetry group of its parameters. We then propose two parameter-sharing schemes to induce the desirable symmetry on the parameters of the neural network. Under some conditions on the action of G, our procedure for tying the parameters achieves G-equivariance and guarantees sensitivity to all other permutation groups outside of G.
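The abstract describes tying (sharing) the parameters of a layer so that the weight matrix is symmetric under the action of G on its rows and columns, which in turn makes the layer G-equivariant. The snippet below is a minimal illustrative sketch, not the paper's construction: it ties the weights of a linear layer into a circulant matrix, i.e. parameters are shared along the orbits of the cyclic group acting by circular shifts, and then checks that the layer commutes with the group action. The choice of the cyclic group and all variable names are assumptions made for illustration.

```python
import numpy as np

# Sketch: a linear layer phi_W(x) = W x made equivariant to the cyclic group C_n
# acting on inputs and outputs by circular shifts. Tying parameters so that W is
# circulant (W[i, j] depends only on (j - i) mod n) gives W P = P W for every
# shift permutation P, hence phi_W(P x) = P phi_W(x).

n = 4
base = np.random.randn(n)                             # free parameters: one shared row
W = np.stack([np.roll(base, k) for k in range(n)])    # circulant weight matrix

def phi(x):
    return W @ x          # linear layer (nonlinearity omitted for brevity)

def g_act(x, k=1):
    return np.roll(x, k)  # action of a generator of the cyclic group

x = np.random.randn(n)
# Equivariance check: transforming the input and then applying the layer
# equals applying the layer and then transforming the output.
assert np.allclose(phi(g_act(x)), g_act(phi(x)))
```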
Cite
Text
Ravanbakhsh et al. "Equivariance Through Parameter-Sharing." International Conference on Machine Learning, 2017.
Markdown
[Ravanbakhsh et al. "Equivariance Through Parameter-Sharing." International Conference on Machine Learning, 2017.](https://mlanthology.org/icml/2017/ravanbakhsh2017icml-equivariance/)
BibTeX
@inproceedings{ravanbakhsh2017icml-equivariance,
title = {{Equivariance Through Parameter-Sharing}},
author = {Ravanbakhsh, Siamak and Schneider, Jeff and Póczos, Barnabás},
booktitle = {International Conference on Machine Learning},
year = {2017},
pages = {2892-2901},
volume = {70},
url = {https://mlanthology.org/icml/2017/ravanbakhsh2017icml-equivariance/}
}