Linearly Constrained Gaussian Processes Are SkewGPs: Application to Monotonic Preference Learning and Desirability

Abstract

We show that existing approaches to Linearly Constrained Gaussian Processes (LCGP) for regression, based on imposing constraints at a finite set of operational points, can be seen as Skew Gaussian Processes (SkewGPs). In particular, focusing on inequality constraints and building upon a recent unification of regression, classification, and preference learning through SkewGPs, we extend LCGP to handle monotonic preference learning and desirability, which are crucial for understanding and predicting human decision-making. We demonstrate the efficacy of the proposed model on simulated and real data.
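For intuition, here is a minimal sketch of the identity behind the LCGP-to-SkewGP connection, written in our own notation (f, g, \mu, \Sigma below are illustrative symbols, not the paper's). Stack the GP prior evaluated at the test inputs, f, together with the constrained functionals at the operational points, g (e.g. derivatives of the latent function when enforcing monotonicity), so that jointly

\[
\begin{pmatrix} f \\ g \end{pmatrix} \sim \mathcal{N}\!\left( \begin{pmatrix} \mu_f \\ \mu_g \end{pmatrix}, \begin{pmatrix} \Sigma_{ff} & \Sigma_{fg} \\ \Sigma_{gf} & \Sigma_{gg} \end{pmatrix} \right).
\]

Conditioning this Gaussian vector on the inequality constraints g \ge 0 gives

\[
p(f \mid g \ge 0) = \phi(f; \mu_f, \Sigma_{ff}) \,
\frac{\Phi_m\!\big(\mu_g + \Sigma_{gf}\Sigma_{ff}^{-1}(f - \mu_f);\; \Sigma_{gg} - \Sigma_{gf}\Sigma_{ff}^{-1}\Sigma_{fg}\big)}{\Phi_m\!\big(\mu_g;\, \Sigma_{gg}\big)},
\]

where \phi(\cdot; \mu, \Sigma) is the multivariate Gaussian density and \Phi_m(a; \Omega) = P(Z \le a) for Z \sim \mathcal{N}(0, \Omega). A Gaussian density multiplied by a ratio of Gaussian CDFs is the (unified) skew-normal form, which is why constraining a GP at finitely many operational points yields SkewGP-type finite-dimensional distributions. This is the standard orthant-conditioning identity for Gaussian vectors, sketched here only to illustrate the abstract's claim, not the paper's exact construction.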

Cite

Text

Benavoli and Azzimonti. "Linearly Constrained Gaussian Processes Are SkewGPs: Application to Monotonic Preference Learning and Desirability." Uncertainty in Artificial Intelligence, 2024.

Markdown

[Benavoli and Azzimonti. "Linearly Constrained Gaussian Processes Are SkewGPs: Application to Monotonic Preference Learning and Desirability." Uncertainty in Artificial Intelligence, 2024.](https://mlanthology.org/uai/2024/benavoli2024uai-linearly/)

BibTeX

@inproceedings{benavoli2024uai-linearly,
  title     = {{Linearly Constrained Gaussian Processes Are SkewGPs: Application to Monotonic Preference Learning and Desirability}},
  author    = {Benavoli, Alessio and Azzimonti, Dario},
  booktitle = {Uncertainty in Artificial Intelligence},
  year      = {2024},
  pages     = {333--348},
  volume    = {244},
  url       = {https://mlanthology.org/uai/2024/benavoli2024uai-linearly/}
}