Long Short-Term Memory over Recursive Structures
Abstract
The chain-structured long short-term memory (LSTM) has been shown to be effective in a wide range of problems such as speech recognition and machine translation. In this paper, we propose to extend it to tree structures, in which a memory cell can reflect the histories of multiple child cells or multiple descendant cells through a recursive process. We call the model S-LSTM, which provides a principled way of considering long-distance interaction over hierarchies, e.g., language or image parse structures. We leverage the model for semantic composition to understand the meaning of text, a fundamental problem in natural language understanding, and show that it outperforms a state-of-the-art recursive model when that model's composition layers are replaced with S-LSTM memory blocks. We also show that utilizing the given structures yields better performance than ignoring them.
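To make the composition scheme concrete, below is a minimal NumPy sketch of one S-LSTM step over a binary parse tree. It is an illustration under stated assumptions, not the authors' implementation: the class name SLSTMCell, the weight names (W_i, W_fl, W_fr, W_o, W_u), and the initialization are all hypothetical; only the gating structure (one forget gate per child, a shared input and output gate feeding the parent cell) follows the description in the abstract.

# Minimal sketch of one S-LSTM composition step over a binary parse
# tree, assuming the gating structure described in the abstract: the
# parent memory cell combines the memories of its two children through
# separate forget gates. All names and shapes here are illustrative,
# not the paper's notation.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class SLSTMCell:
    """Composes two child (h, c) pairs into a parent (h, c) pair."""

    def __init__(self, dim, seed=0):
        rng = np.random.default_rng(seed)
        # One weight matrix per gate, applied to the concatenated
        # child hidden states [h_left; h_right].
        scale = 1.0 / np.sqrt(2 * dim)
        self.W_i = rng.normal(0, scale, (dim, 2 * dim))   # input gate
        self.W_fl = rng.normal(0, scale, (dim, 2 * dim))  # forget gate, left child
        self.W_fr = rng.normal(0, scale, (dim, 2 * dim))  # forget gate, right child
        self.W_o = rng.normal(0, scale, (dim, 2 * dim))   # output gate
        self.W_u = rng.normal(0, scale, (dim, 2 * dim))   # candidate update
        self.b_i = np.zeros(dim)
        self.b_fl = np.zeros(dim)
        self.b_fr = np.zeros(dim)
        self.b_o = np.zeros(dim)
        self.b_u = np.zeros(dim)

    def compose(self, h_l, c_l, h_r, c_r):
        h = np.concatenate([h_l, h_r])
        i = sigmoid(self.W_i @ h + self.b_i)      # how much new input to admit
        f_l = sigmoid(self.W_fl @ h + self.b_fl)  # how much of the left memory to keep
        f_r = sigmoid(self.W_fr @ h + self.b_fr)  # how much of the right memory to keep
        o = sigmoid(self.W_o @ h + self.b_o)
        u = np.tanh(self.W_u @ h + self.b_u)
        # The parent cell reflects both children's memories, so
        # information can propagate over long distances in the tree.
        c = f_l * c_l + f_r * c_r + i * u
        return o * np.tanh(c), c

# Usage: compose leaf representations bottom-up over a parse tree.
dim = 4
cell = SLSTMCell(dim)
h_l, c_l = 0.1 * np.ones(dim), np.zeros(dim)   # left child state
h_r, c_r = -0.2 * np.ones(dim), np.zeros(dim)  # right child state
h_p, c_p = cell.compose(h_l, c_l, h_r, c_r)

Applied recursively from the leaves to the root, the parent state at each node summarizes its whole subtree, which is what lets the model capture long-distance interactions over the hierarchy.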
Cite
Text
Zhu et al. "Long Short-Term Memory over Recursive Structures." International Conference on Machine Learning, 2015.
Markdown
[Zhu et al. "Long Short-Term Memory over Recursive Structures." International Conference on Machine Learning, 2015.](https://mlanthology.org/icml/2015/zhu2015icml-long/)
BibTeX
@inproceedings{zhu2015icml-long,
title = {{Long Short-Term Memory over Recursive Structures}},
author = {Zhu, Xiaodan and Sobhani, Parinaz and Guo, Hongyu},
booktitle = {International Conference on Machine Learning},
year = {2015},
pages = {1604--1612},
volume = {37},
url = {https://mlanthology.org/icml/2015/zhu2015icml-long/}
}