SGLB: Stochastic Gradient Langevin Boosting
Abstract
This paper introduces Stochastic Gradient Langevin Boosting (SGLB), a powerful and efficient machine learning framework that can handle a wide range of loss functions and has provable generalization guarantees. The method is based on a special form of the Langevin diffusion equation specifically designed for gradient boosting. This allows us to theoretically guarantee global convergence even for multimodal loss functions, while standard gradient boosting algorithms can guarantee convergence only to a local optimum. We also empirically show that SGLB outperforms classic gradient boosting when applied to classification tasks with the 0-1 loss function, which is known to be multimodal.
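As a rough illustration of the idea, the sketch below shows a Langevin-flavored modification of standard gradient boosting: Gaussian noise is injected into the negative gradients before each weak learner is fit, and the ensemble is shrunk multiplicatively at every step, mirroring the regularization and diffusion terms of a Langevin update. The choices here (squared-error loss, shallow sklearn trees, parameter names epsilon, beta, gamma) are illustrative assumptions, not the paper's exact algorithm.

import numpy as np
from sklearn.tree import DecisionTreeRegressor

def sglb_fit(X, y, n_iters=200, epsilon=0.1, beta=1e4, gamma=1e-3, seed=0):
    # Langevin-style boosting sketch: epsilon is the learning rate,
    # beta an inverse "diffusion temperature", gamma the shrinkage rate.
    # All names are illustrative, not the paper's notation.
    rng = np.random.default_rng(seed)
    F = np.zeros(len(y))  # current ensemble prediction on the training set
    trees = []
    for _ in range(n_iters):
        residual = y - F  # negative gradient of the squared-error loss
        # Langevin term: noise with std sqrt(2 / (beta * epsilon)), so each
        # step contributes roughly N(0, 2 * epsilon / beta) noise to the model.
        noisy = residual + rng.normal(0.0, np.sqrt(2.0 / (beta * epsilon)), size=len(y))
        tree = DecisionTreeRegressor(max_depth=3).fit(X, noisy)
        # Multiplicative shrinkage (1 - gamma * epsilon) regularizes the model
        # and, together with the noise, yields Langevin-diffusion-like dynamics.
        F = (1.0 - gamma * epsilon) * F + epsilon * tree.predict(X)
        trees.append(tree)
    return trees

Note that because of the multiplicative shrinkage, scoring new data would require replaying the same schedule: each stored tree's contribution is damped by every shrink step applied after it.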
Cite

Text
Ustimenko and Prokhorenkova. "SGLB: Stochastic Gradient Langevin Boosting." International Conference on Machine Learning, 2021.

Markdown
[Ustimenko and Prokhorenkova. "SGLB: Stochastic Gradient Langevin Boosting." International Conference on Machine Learning, 2021.](https://mlanthology.org/icml/2021/ustimenko2021icml-sglb/)

BibTeX
@inproceedings{ustimenko2021icml-sglb,
title = {{SGLB: Stochastic Gradient Langevin Boosting}},
author = {Ustimenko, Aleksei and Prokhorenkova, Liudmila},
booktitle = {International Conference on Machine Learning},
year = {2021},
pages = {10487--10496},
volume = {139},
url = {https://mlanthology.org/icml/2021/ustimenko2021icml-sglb/}
}