On the Convergence of a Family of Robust Losses for Stochastic Gradient Descent

Abstract

The convergence of Stochastic Gradient Descent (SGD) with convex loss functions has been widely studied. However, vanilla SGD methods using convex losses cannot perform well with noisy labels, which adversely affect each update of the primal variable. Unfortunately, noisy labels are ubiquitous in real-world applications such as crowdsourcing. To handle noisy labels, in this paper we present a family of robust losses for SGD methods. By employing our robust losses, SGD methods reduce the negative effects that noisy labels have on each update of the primal variable. We not only reveal the convergence rate of SGD methods using robust losses, but also provide a robustness analysis of two representative robust losses. Comprehensive experimental results on six real-world datasets show that SGD methods using robust losses are clearly more robust than other baseline methods in most situations, while converging quickly.
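To make the mechanism concrete, below is a minimal sketch (not the paper's exact algorithm) of SGD with one representative bounded robust loss, the ramp (truncated hinge) loss: because the loss saturates once the margin drops below a threshold s, badly mislabeled examples contribute zero subgradient and cannot dominate an update. The function names, step-size schedule, and the choice of s here are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def ramp_loss_grad(w, x, y, s=-1.0):
    """Subgradient of the ramp (truncated hinge) loss at margin z = y * w.x.

    The loss min(max(0, 1 - z), 1 - s) is capped at 1 - s, so examples
    with margin below s (typically mislabeled points far on the wrong
    side) contribute zero subgradient to the update.
    """
    z = y * np.dot(w, x)
    if s < z < 1.0:          # active region of the hinge
        return -y * x
    return np.zeros_like(x)  # saturated (z <= s) or correctly classified (z >= 1)

def sgd_ramp(X, Y, lam=0.01, T=1000, s=-1.0, seed=0):
    """Plain SGD on the L2-regularized ramp loss (an illustrative sketch)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for t in range(1, T + 1):
        i = rng.integers(n)                            # sample one example
        eta = 1.0 / (lam * t)                          # standard decaying step size
        g = lam * w + ramp_loss_grad(w, X[i], Y[i], s) # regularizer + loss subgradient
        w -= eta * g
    return w
```

In this sketch, a convex hinge loss would instead return -y * x for every z < 1, so a mislabeled point with a very negative margin keeps pushing w in the wrong direction at each visit; the cap at margin s is what bounds that influence.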

Cite

Text

Han et al. "On the Convergence of a Family of Robust Losses for Stochastic Gradient Descent." European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, 2016. doi:10.1007/978-3-319-46128-1_42

Markdown

[Han et al. "On the Convergence of a Family of Robust Losses for Stochastic Gradient Descent." European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, 2016.](https://mlanthology.org/ecmlpkdd/2016/han2016ecmlpkdd-convergence/) doi:10.1007/978-3-319-46128-1_42

BibTeX

@inproceedings{han2016ecmlpkdd-convergence,
  title     = {{On the Convergence of a Family of Robust Losses for Stochastic Gradient Descent}},
  author    = {Han, Bo and Tsang, Ivor W. and Chen, Ling},
  booktitle = {European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases},
  year      = {2016},
  pages     = {665--680},
  doi       = {10.1007/978-3-319-46128-1_42},
  url       = {https://mlanthology.org/ecmlpkdd/2016/han2016ecmlpkdd-convergence/}
}