Backforward Propagation (Student Abstract)
Abstract
In this paper we introduce Backforward Propagation, a method that completely eliminates Internal Covariate Shift (ICS). Unlike previous methods, which only indirectly reduce the impact of ICS while introducing other biases, our approach provides a surgical view of the effects ICS has on training neural networks. Our experiments show that ICS has a weight-regularizing effect on models, and that removing it entirely enables faster convergence of the neural network.
Cite
Text
Stoica and Simionescu. "Backforward Propagation (Student Abstract)." AAAI Conference on Artificial Intelligence, 2023. doi:10.1609/AAAI.V37I13.27029
Markdown
[Stoica and Simionescu. "Backforward Propagation (Student Abstract)." AAAI Conference on Artificial Intelligence, 2023.](https://mlanthology.org/aaai/2023/stoica2023aaai-backforward/) doi:10.1609/AAAI.V37I13.27029
BibTeX
@inproceedings{stoica2023aaai-backforward,
title = {{Backforward Propagation (Student Abstract)}},
author = {Stoica, George and Simionescu, Cristian},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2023},
pages = {16338-16339},
doi = {10.1609/AAAI.V37I13.27029},
url = {https://mlanthology.org/aaai/2023/stoica2023aaai-backforward/}
}