The Impossibility of Parallelizing Boosting
Abstract
The aim of boosting is to convert a sequence of weak learners into a strong learner. At their heart, these methods are fully sequential. In this paper, we investigate the possibility of parallelizing boosting. Our main contribution is a strong negative result, implying that significant parallelization of boosting requires an exponential blow-up in the total computing resources needed for training.
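To make the sequential nature concrete, below is a minimal AdaBoost-style sketch (not the paper's construction; standard textbook AdaBoost with hypothetical threshold-stump weak learners). Each round's sample weights depend on the hypothesis chosen in the previous round, which is exactly the data dependence that resists parallelization.

```python
import math

def adaboost(X, y, weak_learners, rounds):
    """Sequential AdaBoost: round t's weights depend on round t-1's choice."""
    n = len(X)
    w = [1.0 / n] * n          # uniform initial sample weights
    ensemble = []
    for _ in range(rounds):
        # Pick the weak learner with the smallest weighted error.
        best_h, best_err = None, float("inf")
        for h in weak_learners:
            err = sum(wi for wi, xi, yi in zip(w, X, y) if h(xi) != yi)
            if err < best_err:
                best_h, best_err = h, err
        # Clip the error away from 0 and 1 to keep alpha finite.
        best_err = max(min(best_err, 1 - 1e-10), 1e-10)
        alpha = 0.5 * math.log((1 - best_err) / best_err)
        ensemble.append((alpha, best_h))
        # Reweight: points best_h misclassified gain weight next round.
        # This step is why the loop cannot be run in parallel.
        w = [wi * math.exp(-alpha * yi * best_h(xi))
             for wi, xi, yi in zip(w, X, y)]
        s = sum(w)
        w = [wi / s for wi in w]
    # Strong learner: sign of the weighted vote of all chosen hypotheses.
    return lambda x: 1 if sum(a * h(x) for a, h in ensemble) >= 0 else -1

# Toy demo on made-up data: labels in {-1, +1}, stumps as weak learners.
X = [0, 1, 2, 3]
y = [1, 1, -1, -1]
weak = [lambda x, t=t, s=s: s if x < t else -s
        for t in (0.5, 1.5, 2.5, 3.5) for s in (1, -1)]
strong = adaboost(X, y, weak, rounds=3)
```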
Cite

Text
Karbasi and Green Larsen. "The Impossibility of Parallelizing Boosting." Proceedings of The 35th International Conference on Algorithmic Learning Theory, 2024.

Markdown
[Karbasi and Green Larsen. "The Impossibility of Parallelizing Boosting." Proceedings of The 35th International Conference on Algorithmic Learning Theory, 2024.](https://mlanthology.org/alt/2024/karbasi2024alt-impossibility/)

BibTeX
@inproceedings{karbasi2024alt-impossibility,
title = {{The Impossibility of Parallelizing Boosting}},
author = {Karbasi, Amin and Green Larsen, Kasper},
booktitle = {Proceedings of The 35th International Conference on Algorithmic Learning Theory},
year = {2024},
pages = {635--653},
volume = {237},
url = {https://mlanthology.org/alt/2024/karbasi2024alt-impossibility/}
}