Deep Non-Crossing Quantiles Through the Partial Derivative

Abstract

Quantile Regression (QR) provides a way to approximate a single conditional quantile. To obtain a more informative description of the conditional distribution, QR can be merged with deep learning techniques to simultaneously estimate multiple quantiles. However, minimisation of the QR loss function does not guarantee non-crossing quantiles, which affects the validity of such predictions and introduces a critical issue in certain scenarios. In this article, we propose a generic deep learning algorithm for predicting an arbitrary number of quantiles that ensures the quantile monotonicity constraint up to machine precision while maintaining modelling performance with respect to alternative models. The presented method is evaluated over several real-world datasets, obtaining state-of-the-art results and showing that it scales to large datasets.
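The core idea the abstract refers to, predicting many quantiles at once while guaranteeing they never cross, can be illustrated with a minimal sketch. The snippet below is not the authors' exact partial-derivative architecture; it shows one common way to enforce the monotonicity constraint by construction, parametrising each quantile above the lowest as a strictly positive increment (all function names here are illustrative):

```python
import numpy as np

def softplus(z):
    # Numerically stable softplus: log(1 + exp(z)) > 0 for all z.
    return np.logaddexp(0.0, z)

def non_crossing_quantiles(raw_outputs):
    """Map unconstrained network outputs to monotone quantile estimates.

    raw_outputs: array of shape (batch, n_quantiles). The first column
    models the lowest quantile; the remaining columns are unconstrained
    values that become strictly positive gaps between consecutive
    quantile levels. The returned columns are therefore non-decreasing,
    so the predicted quantiles cannot cross, up to machine precision.
    """
    base = raw_outputs[:, :1]                  # lowest quantile estimate
    gaps = softplus(raw_outputs[:, 1:])        # strictly positive increments
    return np.concatenate([base, base + np.cumsum(gaps, axis=1)], axis=1)

# Example: three samples, five quantile levels, arbitrary raw outputs.
raw = np.random.randn(3, 5)
q = non_crossing_quantiles(raw)
assert np.all(np.diff(q, axis=1) > 0)  # monotone along the quantile axis
```

Because monotonicity holds for any input by construction, no penalty term or post-hoc sorting of the predicted quantiles is needed during training.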

Cite

Text

Brando et al. "Deep Non-Crossing Quantiles Through the Partial Derivative." Artificial Intelligence and Statistics, 2022.

Markdown

[Brando et al. "Deep Non-Crossing Quantiles Through the Partial Derivative." Artificial Intelligence and Statistics, 2022.](https://mlanthology.org/aistats/2022/brando2022aistats-deep/)

BibTeX

@inproceedings{brando2022aistats-deep,
  title     = {{Deep Non-Crossing Quantiles Through the Partial Derivative}},
  author    = {Brando, Axel and Gimeno, Joan and Rodriguez-Serrano, Jose and Vitria, Jordi},
  booktitle = {Artificial Intelligence and Statistics},
  year      = {2022},
  pages     = {7902--7914},
  volume    = {151},
  url       = {https://mlanthology.org/aistats/2022/brando2022aistats-deep/}
}