Framing RNN as a Kernel Method: A Neural ODE Approach

Abstract

Building on the interpretation of a recurrent neural network (RNN) as a continuous-time neural differential equation, we show, under appropriate conditions, that the solution of an RNN can be viewed as a linear function of a specific feature set of the input sequence, known as the signature. This connection allows us to frame an RNN as a kernel method in a suitable reproducing kernel Hilbert space. As a consequence, we obtain theoretical guarantees on generalization and stability for a large class of recurrent networks. Our results are illustrated on simulated datasets.
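The signature mentioned in the abstract is the collection of iterated integrals of the input path. As a minimal sketch (not code from the paper, and with all function names our own), the depth-2 signature of a piecewise-linear path can be computed segment by segment and combined with Chen's identity:

```python
import numpy as np

def segment_signature(delta):
    """Depth-2 signature of one linear segment with increment `delta`:
    level 1 is the increment, level 2 is outer(delta, delta) / 2."""
    return delta.copy(), np.outer(delta, delta) / 2.0

def chen_concat(sig_a, sig_b):
    """Signature of the concatenated path, via Chen's identity at depth 2."""
    s1a, s2a = sig_a
    s1b, s2b = sig_b
    return s1a + s1b, s2a + s2b + np.outer(s1a, s1b)

def signature(points):
    """Depth-2 signature of the piecewise-linear path through `points`,
    an array of shape (n, d)."""
    sig = segment_signature(points[1] - points[0])
    for i in range(1, len(points) - 1):
        sig = chen_concat(sig, segment_signature(points[i + 1] - points[i]))
    return sig

# A short 2-D path with three linear segments.
pts = np.array([[0.0, 0.0], [1.0, 0.5], [1.5, 2.0], [3.0, 1.0]])
s1, s2 = signature(pts)
# Level 1 equals the total increment of the path, and the shuffle
# identity s2 + s2.T == outer(s1, s1) holds for any path.
```

Truncating the signature at a fixed depth yields the finite feature set on which, under the paper's conditions, the RNN solution acts linearly.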

Cite

Text

Fermanian et al. "Framing RNN as a Kernel Method: A Neural ODE Approach." Neural Information Processing Systems, 2021.

Markdown

[Fermanian et al. "Framing RNN as a Kernel Method: A Neural ODE Approach." Neural Information Processing Systems, 2021.](https://mlanthology.org/neurips/2021/fermanian2021neurips-framing/)

BibTeX

@inproceedings{fermanian2021neurips-framing,
  title     = {{Framing RNN as a Kernel Method: A Neural ODE Approach}},
  author    = {Fermanian, Adeline and Marion, Pierre and Vert, Jean-Philippe and Biau, Gérard},
  booktitle = {Neural Information Processing Systems},
  year      = {2021},
  url       = {https://mlanthology.org/neurips/2021/fermanian2021neurips-framing/}
}