The Isotron Algorithm: High-Dimensional Isotonic Regression

Abstract

The Perceptron algorithm elegantly solves binary classification problems that have a margin between positive and negative examples. Isotonic regression (fitting an arbitrary increasing function in one dimension) is also a natural problem with a simple solution. By combining the two, we get a new yet very simple algorithm with strong guarantees. Our ISOTRON algorithm provably learns Single Index Models (SIMs) with unknown mean functions that are nondecreasing and Lipschitz-continuous, a class that generalizes linear and logistic regression, generalized linear models, and binary classification by linear threshold functions (with a margin). Like the Perceptron, it is straightforward to implement and kernelize. Hence, the ISOTRON provides a very simple yet flexible and principled approach to regression.
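The combination described above can be sketched in a few lines: each iteration fits a one-dimensional isotonic regression (via the standard Pool Adjacent Violators routine) of the labels against the current projections `w·x`, then makes a Perceptron-like additive update with the residuals. This is a minimal illustrative sketch, not the paper's exact pseudocode; the function names `pav` and `isotron` and the synthetic demo are assumptions for illustration.

```python
import numpy as np

def pav(y):
    """Pool Adjacent Violators: best nondecreasing (least-squares) fit
    to the sequence y, which must already be ordered by the x values."""
    values, weights = [], []
    for v in y:
        values.append(float(v))
        weights.append(1)
        # Merge adjacent blocks while they violate monotonicity.
        while len(values) > 1 and values[-2] > values[-1]:
            w = weights[-2] + weights[-1]
            merged = (values[-2] * weights[-2] + values[-1] * weights[-1]) / w
            values[-2:] = [merged]
            weights[-2:] = [w]
    # Expand the constant blocks back to per-point fitted values.
    out = []
    for v, w in zip(values, weights):
        out.extend([v] * w)
    return np.array(out)

def isotron(X, y, iters=50):
    """Isotron-style alternation (sketch): isotonic fit along w,
    then a Perceptron-like update with the residuals."""
    m, d = X.shape
    w = np.zeros(d)
    u = np.zeros(m)
    for _ in range(iters):
        z = X @ w
        order = np.argsort(z)
        u_sorted = pav(y[order])        # isotonic regression of y on w·x
        u = np.empty(m)
        u[order] = u_sorted
        # Additive update, averaged over the sample (no learning rate).
        w = w + (X * (y - u)[:, None]).mean(axis=0)
    return w, u

# Demo on synthetic noiseless SIM data: y = sigmoid(x · w_star).
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 3))
w_star = np.array([1.0, 2.0, -1.0])
y = 1.0 / (1.0 + np.exp(-(X @ w_star)))
w, u = isotron(X, y)
err_final = np.mean((u - y) ** 2)
cos = w @ w_star / (np.linalg.norm(w) * np.linalg.norm(w_star))
```

After a few dozen iterations on data of this form, the recovered direction `w` aligns closely with `w_star`, and the isotonic fit `u` tracks the labels far better than a constant predictor. For out-of-sample prediction one would additionally interpolate the fitted monotone function between training projections.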

Cite

Text

Kalai and Sastry. "The Isotron Algorithm: High-Dimensional Isotonic Regression." Annual Conference on Computational Learning Theory, 2009.

Markdown

[Kalai and Sastry. "The Isotron Algorithm: High-Dimensional Isotonic Regression." Annual Conference on Computational Learning Theory, 2009.](https://mlanthology.org/colt/2009/kalai2009colt-isotron/)

BibTeX

@inproceedings{kalai2009colt-isotron,
  title     = {{The Isotron Algorithm: High-Dimensional Isotonic Regression}},
  author    = {Kalai, Adam Tauman and Sastry, Ravi},
  booktitle = {Annual Conference on Computational Learning Theory},
  year      = {2009},
  url       = {https://mlanthology.org/colt/2009/kalai2009colt-isotron/}
}