Data-Dependent Path Normalization in Neural Networks

Abstract

We propose a unified framework for neural net normalization, regularization, and optimization, which includes Path-SGD and Batch-Normalization and interpolates between them across two different dimensions. Through this framework, we investigate issues of invariance of the optimization, data dependence, and the connection with natural gradients.
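
The abstract contrasts Path-SGD's rescaling-invariant path norm with Batch-Normalization's data-dependent per-unit scaling. The short NumPy sketch below illustrates those two quantities for a toy one-hidden-layer ReLU network; it is an illustrative comparison only, not the paper's data-dependent path normalization scheme, and all names (W1, w2, X, etc.) are placeholders.

import numpy as np

rng = np.random.default_rng(0)

# Toy one-hidden-layer ReLU network: x -> ReLU(x @ W1) @ w2
d, h, n = 5, 8, 64           # input dim, hidden units, batch size
W1 = rng.normal(size=(d, h))
w2 = rng.normal(size=h)
X = rng.normal(size=(n, d))  # a mini-batch of inputs

# Path norm (data-independent): sum over all input->hidden->output paths
# of the product of squared weights along each path.
sq_path_norm = np.sum((W1 ** 2) * (w2 ** 2)[None, :])
print("squared path norm:", sq_path_norm)

# Batch-Normalization statistics (data-dependent): per-hidden-unit mean and
# standard deviation of the pre-activations over the mini-batch.
Z = X @ W1                    # pre-activations, shape (n, h)
mu = Z.mean(axis=0)
sigma = Z.std(axis=0) + 1e-5  # small epsilon for numerical stability
Z_hat = (Z - mu) / sigma      # normalized pre-activations
print("per-unit std after normalization:", Z_hat.std(axis=0))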

Cite

Text

Neyshabur et al. "Data-Dependent Path Normalization in Neural Networks." International Conference on Learning Representations, 2016.

Markdown

[Neyshabur et al. "Data-Dependent Path Normalization in Neural Networks." International Conference on Learning Representations, 2016.](https://mlanthology.org/iclr/2016/neyshabur2016iclr-data/)

BibTeX

@inproceedings{neyshabur2016iclr-data,
  title     = {{Data-Dependent Path Normalization in Neural Networks}},
  author    = {Neyshabur, Behnam and Tomioka, Ryota and Salakhutdinov, Ruslan and Srebro, Nathan},
  booktitle = {International Conference on Learning Representations},
  year      = {2016},
  url       = {https://mlanthology.org/iclr/2016/neyshabur2016iclr-data/}
}