A Deep Architecture for Log-Linear Models

Abstract

We present a novel perspective on deep learning architectures using a partial order structure, which is naturally incorporated into the information-geometric formulation of the log-linear model. Our formulation provides a different perspective on deep learning by realizing the biases and weights as different layers of our partial order structure. This formulation of the neural network does not require any gradient computation; instead, the parameters can be estimated efficiently with the EM algorithm.
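For context, a minimal sketch of the log-linear model on a partial order that this architecture builds on, assuming the standard poset parameterization (the symbols below are illustrative and not taken from this page): for a finite partially ordered set $(S, \le)$,

$$
\log p(x) \;=\; \sum_{s \,\le\, x} \theta(s), \qquad \eta(x) \;=\; \sum_{s \,\ge\, x} p(s),
$$

where the natural parameters $\theta$ play the role of the biases and weights assigned to different layers of the partial order, and the expectation parameters $\eta$ are the dual coordinates used when the parameters are estimated with the EM algorithm rather than by gradient-based optimization.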

Cite

Text

Luo et al. "A Deep Architecture for Log-Linear Models." NeurIPS 2020 Workshops: DL-IG, 2020.

Markdown

[Luo et al. "A Deep Architecture for Log-Linear Models." NeurIPS 2020 Workshops: DL-IG, 2020.](https://mlanthology.org/neuripsw/2020/luo2020neuripsw-deep/)

BibTeX

@inproceedings{luo2020neuripsw-deep,
  title     = {{A Deep Architecture for Log-Linear Models}},
  author    = {Luo, Simon and Cripps, Sally and Sugiyama, Mahito},
  booktitle = {NeurIPS 2020 Workshops: DL-IG},
  year      = {2020},
  url       = {https://mlanthology.org/neuripsw/2020/luo2020neuripsw-deep/}
}