Multi-Observation Regression

Abstract

Given a data set of $(x,y)$ pairs, a common learning task is to fit a model predicting $y$ (a label or dependent variable) conditioned on $x$. This paper considers the similar but much less-understood problem of modeling “higher-order” statistics of $y$’s distribution conditioned on $x$. Such statistics are often challenging to estimate using traditional empirical risk minimization (ERM) approaches. We develop and theoretically analyze an ERM-like approach with multi-observation loss functions. We propose four algorithms formalizing the concept of ERM for this problem, two of which have statistical guarantees in settings allowing both slow and fast convergence rates, but which are outperformed empirically by the other two. Empirical results illustrate the potential practicality of these algorithms in low dimensions and significant improvement over standard approaches in some settings.
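To give a flavor of the multi-observation ERM idea (this is a minimal illustrative sketch, not the paper's four algorithms), one classical example of a higher-order statistic elicitable with two observations is the conditional variance: for i.i.d. draws $y_1, y_2$ from $P(Y \mid x)$, $\mathbb{E}[(y_1 - y_2)^2/2] = \mathrm{Var}(Y \mid x)$, so the 2-observation squared loss $L(r, y_1, y_2) = (r - (y_1 - y_2)^2/2)^2$ can be minimized empirically over a hypothesis class. The hypothesis class and synthetic data below are assumptions chosen only for illustration.

```python
import numpy as np

# Illustrative sketch of multi-observation ERM (not the paper's algorithms):
# the conditional variance Var(Y | X = x) is elicited by the 2-observation
# squared loss  L(r, y1, y2) = (r - (y1 - y2)^2 / 2)^2,  since for i.i.d.
# Y1, Y2 ~ P(Y | x) we have  E[(Y1 - Y2)^2 / 2] = Var(Y | x).

rng = np.random.default_rng(0)

# Synthetic data (assumed for this example): two i.i.d. observations of y
# per x, with true conditional variance 1 + x^2.
n = 2000
x = rng.uniform(-2, 2, size=n)
std = np.sqrt(1.0 + x**2)
y1 = rng.normal(0.0, std)
y2 = rng.normal(0.0, std)

# Multi-observation targets: unbiased one-sample estimates of Var(Y | x).
z = (y1 - y2) ** 2 / 2.0

# ERM over a small linear-in-features hypothesis class f(x) = a + b * x^2,
# which for the squared loss reduces to ordinary least squares on z.
features = np.column_stack([np.ones_like(x), x**2])
coef, *_ = np.linalg.lstsq(features, z, rcond=None)
print("estimated variance model: %.3f + %.3f * x^2" % (coef[0], coef[1]))
# The fitted coefficients should be close to 1.000 and 1.000,
# recovering the true conditional variance 1 + x^2.
```

Note that a standard single-observation regression of $y$ on $x$ cannot recover this statistic, since each individual $y$ carries no unbiased signal of the conditional variance; the pairing of observations is what makes the ERM-style estimate well-posed.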

Cite

Text

Frongillo et al. "Multi-Observation Regression." Artificial Intelligence and Statistics, 2019.

Markdown

[Frongillo et al. "Multi-Observation Regression." Artificial Intelligence and Statistics, 2019.](https://mlanthology.org/aistats/2019/frongillo2019aistats-multiobservation/)

BibTeX

@inproceedings{frongillo2019aistats-multiobservation,
  title     = {{Multi-Observation Regression}},
  author    = {Frongillo, Rafael and Mehta, Nishant A. and Morgan, Tom and Waggoner, Bo},
  booktitle = {Artificial Intelligence and Statistics},
  year      = {2019},
  pages     = {2691--2700},
  volume    = {89},
  url       = {https://mlanthology.org/aistats/2019/frongillo2019aistats-multiobservation/}
}