Connecting Weighted Automata and Recurrent Neural Networks Through Spectral Learning
Abstract
In this paper, we unravel a fundamental connection between weighted finite automata (WFAs) and second-order recurrent neural networks (2-RNNs): in the case of sequences of discrete symbols, WFAs and 2-RNNs with linear activation functions are expressively equivalent. Motivated by this result, we build upon a recent extension of the spectral learning algorithm to vector-valued WFAs and propose the first provable learning algorithm for linear 2-RNNs defined over sequences of continuous input vectors. This algorithm relies on estimating low-rank sub-blocks of the so-called Hankel tensor, from which the parameters of a linear 2-RNN can be provably recovered. The performance of the proposed method is assessed in a simulation study.
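To make the objects in the abstract concrete, here is a minimal sketch (not the authors' code) of a linear 2-RNN forward pass. It assumes the standard parameterization of a second-order RNN with linear activations: a start vector `h0`, a third-order transition tensor `A` that combines the previous hidden state and the current input bilinearly, and an output matrix `Omega`; all names are illustrative.

```python
import numpy as np

def linear_2rnn_forward(h0, A, Omega, inputs):
    """Output of a (hypothetical) linear 2-RNN on a sequence of input vectors.

    h0:     start hidden state, shape (n,)
    A:      transition tensor, shape (n, d, n)
    Omega:  output matrix, shape (n, p)
    inputs: sequence of input vectors, each of shape (d,)
    """
    h = h0
    for x in inputs:
        # Bilinear (second-order) update with linear activation:
        # h_new[j] = sum_{i,k} A[i, k, j] * h[i] * x[k]
        h = np.einsum('ikj,i,k->j', A, h, x)
    return Omega.T @ h
```

Note how the WFA connection appears: if each `x` is a one-hot encoding of a discrete symbol `k`, the update selects the matrix slice `A[:, k, :]`, so the computation reduces to a product of per-symbol transition matrices applied to `h0`, which is exactly how a WFA computes the weight of a string.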
Cite
Text
Rabusseau et al. "Connecting Weighted Automata and Recurrent Neural Networks Through Spectral Learning." Artificial Intelligence and Statistics, 2019.

Markdown

[Rabusseau et al. "Connecting Weighted Automata and Recurrent Neural Networks Through Spectral Learning." Artificial Intelligence and Statistics, 2019.](https://mlanthology.org/aistats/2019/rabusseau2019aistats-connecting/)

BibTeX
@inproceedings{rabusseau2019aistats-connecting,
title = {{Connecting Weighted Automata and Recurrent Neural Networks Through Spectral Learning}},
author = {Rabusseau, Guillaume and Li, Tianyu and Precup, Doina},
booktitle = {Artificial Intelligence and Statistics},
year = {2019},
pages = {1630--1639},
volume = {89},
url = {https://mlanthology.org/aistats/2019/rabusseau2019aistats-connecting/}
}