Multi-Zone Unit for Recurrent Neural Networks
Abstract
Recurrent neural networks (RNNs) have been widely used for sequence learning problems. The input-dependent transition function, which folds new observations into hidden states to sequentially construct fixed-length representations of arbitrary-length sequences, plays a critical role in RNNs. Because they compose representations in a single space, transition functions in existing RNNs often have difficulty capturing complicated long-range dependencies. In this paper, we introduce a new Multi-zone Unit (MZU) for RNNs. The key idea is to design a transition function that models composition across multiple spaces. The MZU consists of three components: zone generation, zone composition, and zone aggregation. Experimental results on multiple datasets for character-level language modeling and aspect-based sentiment analysis demonstrate the superiority of the MZU.
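The three stages named above can be sketched in code. The following is a minimal, hypothetical illustration only: the paper's exact formulas for each stage are not reproduced here, and the projection matrices, the attention-style composition, and all dimensions are assumptions made for the sake of a runnable example.

```python
import numpy as np

rng = np.random.default_rng(0)

def mzu_step(x, h, params, n_zones=4):
    """One hypothetical MZU-style transition step.

    Mirrors only the three named stages (zone generation, zone
    composition, zone aggregation); the concrete functions used in
    each stage are illustrative assumptions, not the paper's.
    """
    Wx, Wh, Wc, Wo = params
    d = h.shape[0]

    # Zone generation: project the input and previous state into
    # multiple "zones" (candidate subspaces), one per row.
    zones = np.tanh(Wx @ x + Wh @ h).reshape(n_zones, d)

    # Zone composition: let zones interact -- here via a simple
    # softmax weighting against the previous state (an assumption).
    scores = zones @ (Wc @ h)                # one score per zone
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()

    # Zone aggregation: fold the composed zones into the next state.
    return np.tanh(Wo @ (weights @ zones))

d, x_dim, n_zones = 8, 5, 4
params = (
    rng.normal(size=(n_zones * d, x_dim)) * 0.1,  # input -> zones
    rng.normal(size=(n_zones * d, d)) * 0.1,      # state -> zones
    rng.normal(size=(d, d)) * 0.1,                # composition
    rng.normal(size=(d, d)) * 0.1,                # output
)

h = np.zeros(d)
for x in rng.normal(size=(6, x_dim)):  # an arbitrary-length sequence
    h = mzu_step(x, h, params)
```

Like any RNN cell, the step is applied once per observation, so the fixed-length state `h` summarizes the whole sequence regardless of its length.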
Cite
Text

Meng et al. "Multi-Zone Unit for Recurrent Neural Networks." AAAI Conference on Artificial Intelligence, 2020. doi:10.1609/AAAI.V34I04.5958

Markdown

[Meng et al. "Multi-Zone Unit for Recurrent Neural Networks." AAAI Conference on Artificial Intelligence, 2020.](https://mlanthology.org/aaai/2020/meng2020aaai-multi/) doi:10.1609/AAAI.V34I04.5958

BibTeX
@inproceedings{meng2020aaai-multi,
title = {{Multi-Zone Unit for Recurrent Neural Networks}},
author = {Meng, Fandong and Zhang, Jinchao and Liu, Yang and Zhou, Jie},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2020},
pages = {5150-5157},
doi = {10.1609/AAAI.V34I04.5958},
url = {https://mlanthology.org/aaai/2020/meng2020aaai-multi/}
}