Synthetic Gradient Methods with Virtual Forward-Backward Networks

Abstract

The concept of synthetic gradients introduced by Jaderberg et al. (2016) provides a novel framework for the asynchronous training of neural networks. Their model, however, has a structural weakness: the form of their synthetic gradient bears little relation to the objective function of the target task. In this paper we introduce virtual forward-backward networks (VFBN), a model that produces synthetic gradients whose structure is analogous to the actual gradient of the objective function. VFBN is the first method of its kind that succeeds in decoupling deep networks such as ResNet-110 (He et al., 2016) without compromising performance.
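To make the decoupling idea concrete, the sketch below trains a toy layer using only a predicted gradient in place of the true backward signal. The linear predictor `M`, the squared loss, and the conditioning on the target `y` are illustrative assumptions (in the spirit of Jaderberg et al.'s label-conditioned synthetic gradients), not the paper's VFBN architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy decoupled layer: h = x @ W with loss L = 0.5 * ||h - y||^2,
# whose true gradient is dL/dh = h - y.  A small linear module M
# (a hypothetical stand-in for a learned synthetic-gradient model)
# is trained to predict dL/dh from [h, y], so the layer can update
# without waiting for a real backward pass.
x = rng.normal(size=(64, 8))
y = rng.normal(size=(64, 4))
W = rng.normal(size=(8, 4)) * 0.1
M = np.zeros((8, 4))                 # predictor weights over [h, y]

loss_before = 0.5 * float(np.mean((x @ W - y) ** 2))
lr = 0.05
for _ in range(2000):
    h = x @ W
    inp = np.concatenate([h, y], axis=1)
    synth = inp @ M                  # predicted dL/dh
    true = h - y                     # actual dL/dh (used to train M only)
    M -= lr * inp.T @ (synth - true) / len(x)
    W -= lr * x.T @ synth / len(x)   # layer update uses the synthetic gradient
loss_after = 0.5 * float(np.mean((x @ W - y) ** 2))
# Mismatch between the synthetic and the actual gradient after training.
pred_err = float(np.mean((np.concatenate([x @ W, y], axis=1) @ M
                          - (x @ W - y)) ** 2))
```

Because the true gradient here is exactly linear in `[h, y]`, the predictor can match it closely, and the layer's loss decreases even though its update never sees the real backward pass.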

Cite

Text

Miyato et al. "Synthetic Gradient Methods with Virtual Forward-Backward Networks." International Conference on Learning Representations, 2017.

Markdown

[Miyato et al. "Synthetic Gradient Methods with Virtual Forward-Backward Networks." International Conference on Learning Representations, 2017.](https://mlanthology.org/iclr/2017/miyato2017iclr-synthetic/)

BibTeX

@inproceedings{miyato2017iclr-synthetic,
  title     = {{Synthetic Gradient Methods with Virtual Forward-Backward Networks}},
  author    = {Miyato, Takeru and Okanohara, Daisuke and Maeda, Shin-ichi and Koyama, Masanori},
  booktitle = {International Conference on Learning Representations},
  year      = {2017},
  url       = {https://mlanthology.org/iclr/2017/miyato2017iclr-synthetic/}
}