Decomposing Isotonic Regression for Efficiently Solving Large Problems

Abstract

A new algorithm for isotonic regression is presented based on recursively partitioning the solution space. We develop efficient methods for each partitioning subproblem through an equivalent representation as a network flow problem, and prove that this sequence of partitions converges to the global solution. These network flow problems can be further decomposed in order to solve very large problems. The predictive value of isotonic regression and the favorable computational properties of our algorithm are demonstrated through simulated examples with up to 2×10^5 variables and 10^7 constraints.
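For background, the 1-D (total-order) special case of isotonic regression can be solved exactly by the classical Pool Adjacent Violators Algorithm (PAVA). The sketch below is this standard algorithm, not the paper's recursive-partitioning method for general partial orders; it illustrates the problem being solved: fitting a nondecreasing sequence minimizing squared error.

```python
def isotonic_regression(y):
    """1-D isotonic regression via Pool Adjacent Violators (PAVA).

    Returns the nondecreasing vector minimizing sum of squared
    deviations from y. This is the total-order special case only;
    the paper addresses general partial-order constraints.
    """
    # Each block is kept as (sum, count); its fitted value is the mean.
    sums, counts = [], []
    for v in y:
        sums.append(float(v))
        counts.append(1)
        # Merge adjacent blocks while their means violate monotonicity.
        while len(sums) > 1 and sums[-2] / counts[-2] > sums[-1] / counts[-1]:
            s, c = sums.pop(), counts.pop()
            sums[-1] += s
            counts[-1] += c
    # Expand block means back to a full-length fitted vector.
    fit = []
    for s, c in zip(sums, counts):
        fit.extend([s / c] * c)
    return fit

# Example: the violating prefix [3, 1, 2] pools to its mean 2.
print(isotonic_regression([3, 1, 2, 4]))  # -> [2.0, 2.0, 2.0, 4.0]
```

PAVA runs in linear time for the chain order; the paper's contribution is handling arbitrary partial orders, where the subproblems become network flow problems.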

Cite

Text

Luss et al. "Decomposing Isotonic Regression for Efficiently Solving Large Problems." Neural Information Processing Systems, 2010.

Markdown

[Luss et al. "Decomposing Isotonic Regression for Efficiently Solving Large Problems." Neural Information Processing Systems, 2010.](https://mlanthology.org/neurips/2010/luss2010neurips-decomposing/)

BibTeX

@inproceedings{luss2010neurips-decomposing,
  title     = {{Decomposing Isotonic Regression for Efficiently Solving Large Problems}},
  author    = {Luss, Ronny and Rosset, Saharon and Shahar, Moni},
  booktitle = {Neural Information Processing Systems},
  year      = {2010},
  pages     = {1513--1521},
  url       = {https://mlanthology.org/neurips/2010/luss2010neurips-decomposing/}
}