Adversarial Training for Predictive Tasks: Theoretical Analysis and Limitations in the Deterministic Case.

Abstract

To train a deep neural network to mimic the outputs of a processing sequence, a version of the Conditional Generative Adversarial Network (CGAN) can be used. Others have observed that CGANs can improve results even for deterministic sequences, where exactly one output is associated with a given input. Surprisingly, our CGAN-based experiments on deterministic geophysical processing sequences did not yield a real improvement over an $L_p$ loss; here we propose a first theoretical explanation of why. Our analysis proceeds from the non-deterministic case to the deterministic one, and it led us to develop an adversarial way of training a content loss that gave better results on our data.

Cite

Text

Lesieur et al. "Adversarial Training for Predictive Tasks: Theoretical Analysis and Limitations in the Deterministic Case." NeurIPS 2020 Workshops: ICBINB, 2020.

Markdown

[Lesieur et al. "Adversarial Training for Predictive Tasks: Theoretical Analysis and Limitations in the Deterministic Case." NeurIPS 2020 Workshops: ICBINB, 2020.](https://mlanthology.org/neuripsw/2020/lesieur2020neuripsw-adversarial/)

BibTeX

@inproceedings{lesieur2020neuripsw-adversarial,
  title     = {{Adversarial Training for Predictive Tasks: Theoretical Analysis and Limitations in the Deterministic Case}},
  author    = {Lesieur, Thibault and Messud, Jérémie and Hammoud, Issa and Peng, Hanyuan and Lacombe, Céline and Jeunesse, Paulien},
  booktitle = {NeurIPS 2020 Workshops: ICBINB},
  year      = {2020},
  url       = {https://mlanthology.org/neuripsw/2020/lesieur2020neuripsw-adversarial/}
}