Dynamic Evaluation of Neural Sequence Models
Abstract
We explore dynamic evaluation, in which sequence models are adapted to the recent sequence history via gradient descent, assigning higher probabilities to recurring sequential patterns. We develop a dynamic evaluation approach that outperforms existing adaptation methods in our comparisons. With dynamic evaluation we improve on all previous word-level perplexities on the Penn Treebank and WikiText-2 datasets (achieving 51.1 and 44.3 respectively) and all previous character-level cross-entropies on the text8 and Hutter Prize datasets (achieving 1.19 bits/char and 1.08 bits/char respectively).
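The procedure the abstract describes is straightforward to sketch: score the sequence one segment at a time, and after scoring each segment take a gradient step on that segment's loss, so statistics of the recent history are folded into the weights before the next segment is scored. The code below is a minimal PyTorch illustration under stated assumptions, not the paper's implementation: the toy `CharLM` model, the segment length, the learning rate, and the simple L2 pull back toward the static weights (`decay`) are all illustrative placeholders (the last is a simplified stand-in for the regularised update rule the paper develops).

```python
# Minimal sketch of dynamic evaluation for an autoregressive language model.
# All model and hyperparameter choices here are illustrative, not the
# configuration used by Krause et al.

import torch
import torch.nn as nn
import torch.nn.functional as F

class CharLM(nn.Module):
    """Toy character-level LSTM language model (hypothetical stand-in)."""
    def __init__(self, vocab_size=128, hidden=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        self.lstm = nn.LSTM(hidden, hidden, batch_first=True)
        self.head = nn.Linear(hidden, vocab_size)

    def forward(self, x, state=None):
        h, state = self.lstm(self.embed(x), state)
        return self.head(h), state

def dynamic_eval(model, tokens, segment_len=20, lr=1e-4, decay=0.02):
    """Score `tokens` sequentially; after each segment is scored, take one
    gradient step on that segment so the model adapts to recent history.
    The L2 pull back toward the original weights (`decay`) is a simplified
    stand-in for the global-prior regularisation described in the paper."""
    original = {n: p.detach().clone() for n, p in model.named_parameters()}
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    state, total_nll, total_count = None, 0.0, 0
    for i in range(0, len(tokens) - 1, segment_len):
        y = tokens[i + 1:i + 1 + segment_len].unsqueeze(0)  # targets
        x = tokens[i:i + segment_len].unsqueeze(0)[:, :y.size(1)]  # inputs
        logits, new_state = model(x, state)
        loss = F.cross_entropy(logits.squeeze(0), y.squeeze(0))
        total_nll += loss.item() * y.numel()  # score BEFORE adapting
        total_count += y.numel()
        # Adapt: one gradient step on the segment just scored.
        opt.zero_grad()
        loss.backward()
        opt.step()
        with torch.no_grad():  # pull weights back toward the static model
            for n, p in model.named_parameters():
                p.add_(decay * (original[n] - p))
        state = tuple(s.detach() for s in new_state)  # keep backprop local
    return total_nll / total_count  # average NLL in nats per token
```

A hypothetical usage, on a random token stream:

```python
torch.manual_seed(0)
lm = CharLM()
stream = torch.randint(0, 128, (1000,))  # stand-in token stream
print(f"avg NLL: {dynamic_eval(lm, stream):.3f} nats/token")
```

Updating per segment rather than per token amortises the cost of the backward pass, and detaching the hidden state between segments keeps each update's backpropagation local to the segment just scored.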
Cite
Text
Krause et al. "Dynamic Evaluation of Neural Sequence Models." International Conference on Machine Learning, 2018.
Markdown
[Krause et al. "Dynamic Evaluation of Neural Sequence Models." International Conference on Machine Learning, 2018.](https://mlanthology.org/icml/2018/krause2018icml-dynamic/)
BibTeX
@inproceedings{krause2018icml-dynamic,
  title     = {{Dynamic Evaluation of Neural Sequence Models}},
  author    = {Krause, Ben and Kahembwe, Emmanuel and Murray, Iain and Renals, Steve},
  booktitle = {International Conference on Machine Learning},
  year      = {2018},
  pages     = {2766--2775},
  volume    = {80},
  url       = {https://mlanthology.org/icml/2018/krause2018icml-dynamic/}
}