EDA: Easy Data Augmentation Techniques for Boosting Performance on Text Classification Tasks
Abstract
We present EDA: easy data augmentation techniques for boosting performance on text classification tasks. EDA consists of four simple but powerful operations: synonym replacement, random insertion, random swap, and random deletion. On five text classification tasks, we show that EDA improves performance for both convolutional and recurrent neural networks. EDA demonstrates particularly strong results for smaller datasets; on average, across five datasets, training with EDA while using only 50% of the available training set achieved the same accuracy as normal training with all available data. We also performed extensive ablation studies and suggest parameters for practical use.
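Below is a minimal sketch of the four operations named in the abstract (synonym replacement, random insertion, random swap, random deletion). It is illustrative rather than the authors' released implementation: the synonym source here is WordNet via NLTK (an assumption), and the per-operation counts and deletion probability are placeholder values, not the tuned parameters suggested in the paper.

import random
from nltk.corpus import wordnet  # assumes nltk and its 'wordnet' corpus are installed

def get_synonyms(word):
    """Collect WordNet synonyms for a word, excluding the word itself."""
    synonyms = {lemma.name().replace('_', ' ')
                for syn in wordnet.synsets(word) for lemma in syn.lemmas()}
    synonyms.discard(word)
    return list(synonyms)

def synonym_replacement(words, n=1):
    """Replace up to n randomly chosen words with one of their synonyms."""
    new_words = words[:]
    candidates = [w for w in set(words) if get_synonyms(w)]
    random.shuffle(candidates)
    for word in candidates[:n]:
        synonym = random.choice(get_synonyms(word))
        new_words = [synonym if w == word else w for w in new_words]
    return new_words

def random_insertion(words, n=1):
    """Insert a synonym of a random word at a random position, n times."""
    new_words = words[:]
    for _ in range(n):
        candidates = [w for w in new_words if get_synonyms(w)]
        if not candidates:
            break
        synonym = random.choice(get_synonyms(random.choice(candidates)))
        new_words.insert(random.randrange(len(new_words) + 1), synonym)
    return new_words

def random_swap(words, n=1):
    """Swap the words at two random positions, n times."""
    new_words = words[:]
    for _ in range(n):
        i, j = random.randrange(len(new_words)), random.randrange(len(new_words))
        new_words[i], new_words[j] = new_words[j], new_words[i]
    return new_words

def random_deletion(words, p=0.1):
    """Delete each word independently with probability p; keep at least one word."""
    kept = [w for w in words if random.random() > p]
    return kept if kept else [random.choice(words)]

# Example: one augmented variant per operation for a toy sentence.
sentence = "a sad superior court judge".split()
for op in (synonym_replacement, random_insertion, random_swap, random_deletion):
    print(op.__name__, ':', ' '.join(op(sentence)))

In practice, several augmented sentences are generated per original training example and added to the training set, which is where the reported gains on small datasets come from.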
Cite
Text
Wei and Zou. "EDA: Easy Data Augmentation Techniques for Boosting Performance on Text Classification Tasks." ICLR 2019 Workshops: LLD, 2019.
Markdown
[Wei and Zou. "EDA: Easy Data Augmentation Techniques for Boosting Performance on Text Classification Tasks." ICLR 2019 Workshops: LLD, 2019.](https://mlanthology.org/iclrw/2019/wei2019iclrw-eda/)
BibTeX
@inproceedings{wei2019iclrw-eda,
  title = {{EDA: Easy Data Augmentation Techniques for Boosting Performance on Text Classification Tasks}},
  author = {Wei, Jason and Zou, Kai},
  booktitle = {ICLR 2019 Workshops: LLD},
  year = {2019},
  url = {https://mlanthology.org/iclrw/2019/wei2019iclrw-eda/}
}