Efficient Algorithms for Non-Convex Isotonic Regression Through Submodular Optimization
Abstract
We consider the minimization of submodular functions subject to ordering constraints. We show that this potentially non-convex optimization problem can be cast as a convex optimization problem on a space of uni-dimensional measures, with ordering constraints corresponding to first-order stochastic dominance. We propose new discretization schemes that lead to simple and efficient algorithms based on zero-th, first, or higher order oracles; these algorithms also lead to improvements without isotonic constraints. Finally, our experiments show that non-convex loss functions can be much more robust to outliers for isotonic regression, while still being solvable in polynomial time.
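The convex special case of the problem treated here (squared loss under ordering constraints) is classical isotonic regression, which the paper generalizes to non-convex separable losses. As a point of reference, a minimal sketch of the standard pool-adjacent-violators algorithm for that convex baseline (the function name `isotonic_regression` is illustrative, not from the paper):

```python
def isotonic_regression(y):
    """Minimize sum_i (x_i - y_i)^2 subject to x_1 <= ... <= x_n
    via pool adjacent violators (the convex, squared-loss baseline)."""
    # Each block stores [running mean, number of pooled points].
    merged = []
    for v in y:
        merged.append([float(v), 1])
        # Pool adjacent blocks while the ordering constraint is violated.
        while len(merged) > 1 and merged[-2][0] > merged[-1][0]:
            m2, c2 = merged.pop()
            m1, c1 = merged.pop()
            merged.append([(m1 * c1 + m2 * c2) / (c1 + c2), c1 + c2])
    # Expand pooled blocks back to one value per input point.
    out = []
    for mean, count in merged:
        out.extend([mean] * count)
    return out
```

For example, `isotonic_regression([1, 3, 2, 4])` pools the violating pair (3, 2) into their mean 2.5, yielding the nondecreasing fit `[1.0, 2.5, 2.5, 4.0]`. The paper's contribution is handling non-convex losses (e.g. robust to outliers) in this setting, which this convex sketch does not cover.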
Cite
Text
Bach. "Efficient Algorithms for Non-Convex Isotonic Regression Through Submodular Optimization." Neural Information Processing Systems, 2018.
Markdown
[Bach. "Efficient Algorithms for Non-Convex Isotonic Regression Through Submodular Optimization." Neural Information Processing Systems, 2018.](https://mlanthology.org/neurips/2018/bach2018neurips-efficient/)
BibTeX
@inproceedings{bach2018neurips-efficient,
title = {{Efficient Algorithms for Non-Convex Isotonic Regression Through Submodular Optimization}},
author = {Bach, Francis},
booktitle = {Neural Information Processing Systems},
year = {2018},
pages = {1--10},
url = {https://mlanthology.org/neurips/2018/bach2018neurips-efficient/}
}