Naturally Occurring Equivariance in Neural Networks
Abstract
Distill articles are interactive publications and do not include traditional abstracts. This summary was written for the ML Anthology. The article demonstrates how neural networks naturally learn multiple transformed copies of the same feature connected by symmetric weights, showing that equivariance emerges organically in trained networks without being explicitly designed.
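For context, equivariance means that transforming a feature detector's input transforms its output in a matching way, f(T(x)) = T'(f(x)). The following minimal NumPy sketch (illustrative only, not from the article) checks this property for the one case where it is designed in, translation and convolution; the article's finding is that analogous structure for rotation, flips, hue shifts, and scale emerges in learned weights without being built in.

```python
import numpy as np
from scipy.signal import convolve2d

# Toy check of the equivariance property f(T(x)) == T'(f(x)):
# circular convolution commutes with translation, so shifting the
# input shifts the feature map by the same amount.
rng = np.random.default_rng(0)
x = rng.standard_normal((16, 16))  # toy input "image" (hypothetical data)
w = rng.standard_normal((3, 3))    # one convolutional filter

def shift(a):
    """Translate down by one pixel, wrapping around the edge."""
    return np.roll(a, 1, axis=0)

# Apply the filter, then shift -- versus shift, then apply the filter.
out_then_shift = shift(convolve2d(x, w, mode="same", boundary="wrap"))
shift_then_out = convolve2d(shift(x), w, mode="same", boundary="wrap")

assert np.allclose(out_then_shift, shift_then_out)  # equivariance holds
```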
Cite
Text
Olah et al. "Naturally Occurring Equivariance in Neural Networks." Distill, 2020. doi:10.23915/distill.00024.004
Markdown
[Olah et al. "Naturally Occurring Equivariance in Neural Networks." Distill, 2020.](https://mlanthology.org/distill/2020/olah2020distill-naturally/) doi:10.23915/distill.00024.004
BibTeX
@article{olah2020distill-naturally,
title = {{Naturally Occurring Equivariance in Neural Networks}},
author = {Olah, Chris and Cammarata, Nick and Voss, Chelsea and Schubert, Ludwig and Goh, Gabriel},
journal = {Distill},
year = {2020},
doi = {10.23915/distill.00024.004},
url = {https://mlanthology.org/distill/2020/olah2020distill-naturally/}
}