Fixing Mini-Batch Sequences with Hierarchical Robust Partitioning
Abstract
We propose a general and efficient hierarchical robust partitioning framework that generates a deterministic sequence of mini-batches with quality assurances that a randomly drawn sequence lacks. We compare our deterministically generated mini-batch sequences to randomly generated ones and show that, on a variety of deep learning tasks, the deterministic sequences significantly beat the mean and worst-case performance of the random sequences, and often outperform even the best of the random sequences. On the theoretical side, we contribute a new algorithm for the robust submodular partition problem subject to cardinality constraints (which is used to construct mini-batch sequences), show that the algorithm is fast and has good theoretical guarantees, and give a more efficient hierarchical variant with similar guarantees under mild assumptions.
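To make the robust-partitioning idea concrete, here is a minimal sketch (not the paper's exact algorithm): items are greedily assigned to the currently weakest non-full batch so as to raise the minimum batch value under a monotone submodular function. The feature-saturation function `f(S) = sum_j sqrt(sum_{i in S} x[i][j])` used below is a standard illustrative choice, not necessarily the one used in the paper, and all function names are hypothetical.

```python
import math
import random

def feature_saturation(batch, features):
    # f(S) = sum_j sqrt(sum_{i in S} x[i][j]); monotone submodular,
    # rewarding batches that cover many feature dimensions.
    if not batch:
        return 0.0
    dim = len(features[0])
    return sum(math.sqrt(sum(features[i][j] for i in batch)) for j in range(dim))

def greedy_robust_partition(features, num_batches, batch_size):
    """Greedily assign each item to the non-full batch with the smallest
    current value, aiming to maximize min_k f(batch_k) subject to the
    cardinality constraint |batch_k| <= batch_size."""
    batches = [[] for _ in range(num_batches)]
    values = [0.0] * num_batches
    for i in range(len(features)):
        open_batches = [k for k in range(num_batches)
                        if len(batches[k]) < batch_size]
        # Raising the weakest batch is what improves the min over batches.
        k = min(open_batches, key=lambda k: values[k])
        batches[k].append(i)
        values[k] = feature_saturation(batches[k], features)
    return batches
```

Once the partition is fixed, iterating over its blocks in a fixed order yields a deterministic mini-batch sequence, in contrast to redrawing random batches each epoch.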
Cite
Text
Wang et al. "Fixing Mini-Batch Sequences with Hierarchical Robust Partitioning." Artificial Intelligence and Statistics, 2019.
Markdown
[Wang et al. "Fixing Mini-Batch Sequences with Hierarchical Robust Partitioning." Artificial Intelligence and Statistics, 2019.](https://mlanthology.org/aistats/2019/wang2019aistats-fixing/)
BibTeX
@inproceedings{wang2019aistats-fixing,
title = {{Fixing Mini-Batch Sequences with Hierarchical Robust Partitioning}},
author = {Wang, Shengjie and Bai, Wenruo and Lavania, Chandrashekhar and Bilmes, Jeff},
booktitle = {Artificial Intelligence and Statistics},
year = {2019},
pages = {3352--3361},
volume = {89},
url = {https://mlanthology.org/aistats/2019/wang2019aistats-fixing/}
}