Learning Rates for Kernel-Based Expectile Regression

Abstract

Conditional expectiles are becoming an increasingly important tool in finance as well as in other areas of application. We analyse a support vector machine type approach for estimating conditional expectiles and establish learning rates that are minimax optimal modulo a logarithmic factor if Gaussian RBF kernels are used and the desired expectile is smooth in a Besov sense. As a special case, our learning rates improve the best known rates for kernel-based least squares regression in the aforementioned scenario. Key ingredients of our statistical analysis are a general calibration inequality for the asymmetric least squares loss, a corresponding variance bound, as well as an improved entropy number bound for Gaussian RBF kernels.
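
The estimator analysed in the paper is a regularized empirical risk minimizer over a Gaussian RBF kernel RKHS with the asymmetric least squares (expectile) loss L_tau(y, t) = |tau - 1{y < t}| (y - t)^2. As a rough illustration only, the following Python sketch fits such an estimator by iteratively reweighted kernel ridge regression; the solver choice and all names (`kernel_expectile_regression`, `gamma`, `lam`) are assumptions for this sketch, not code from the paper.

```python
import numpy as np

def gaussian_kernel(X, Z, gamma):
    """Gaussian RBF kernel matrix: k(x, z) = exp(-gamma * ||x - z||^2)."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kernel_expectile_regression(X, y, tau=0.5, gamma=1.0, lam=1e-3, n_iter=50):
    """Fit f = sum_i alpha_i k(x_i, .) minimizing the asymmetric least
    squares (expectile) loss plus an RKHS norm penalty, via iteratively
    reweighted kernel ridge regression (a common solver for this loss)."""
    n = len(y)
    K = gaussian_kernel(X, X, gamma)
    alpha = np.zeros(n)
    for _ in range(n_iter):
        resid = y - K @ alpha
        # asymmetric weights: tau for non-negative residuals, 1 - tau otherwise
        w = np.where(resid >= 0, tau, 1.0 - tau)
        # first-order condition of the weighted objective:
        # (W K + n * lam * I) alpha = W y
        alpha_new = np.linalg.solve(w[:, None] * K + n * lam * np.eye(n), w * y)
        if np.allclose(alpha_new, alpha, atol=1e-8):
            alpha = alpha_new
            break
        alpha = alpha_new
    return alpha, K

# toy usage: estimate the 0.9-expectile of a noisy sine curve
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(100, 1))
y = np.sin(3 * X[:, 0]) + 0.2 * rng.standard_normal(100)
alpha, K = kernel_expectile_regression(X, y, tau=0.9)
f_hat = K @ alpha  # fitted expectile values at the training points
```

In the paper, the regularization parameter and kernel width are chosen adaptively (the learning rates depend on this choice); the fixed values above are placeholders for illustration.
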

Cite

Text

Farooq and Steinwart. "Learning Rates for Kernel-Based Expectile Regression." Machine Learning, 2019. doi:10.1007/s10994-018-5762-9

Markdown

[Farooq and Steinwart. "Learning Rates for Kernel-Based Expectile Regression." Machine Learning, 2019.](https://mlanthology.org/mlj/2019/farooq2019mlj-learning/) doi:10.1007/s10994-018-5762-9

BibTeX

@article{farooq2019mlj-learning,
  title     = {{Learning Rates for Kernel-Based Expectile Regression}},
  author    = {Farooq, Muhammad and Steinwart, Ingo},
  journal   = {Machine Learning},
  year      = {2019},
  pages     = {203--227},
  doi       = {10.1007/s10994-018-5762-9},
  volume    = {108},
  url       = {https://mlanthology.org/mlj/2019/farooq2019mlj-learning/}
}