Approximation Algorithms for Minimizing Empirical Error by Axis-Parallel Hyperplanes

Abstract

Many learning situations involve separating labeled training instances with hyperplanes. Consistent separation is of theoretical interest, but in practice the goal is rather to minimize the number of errors using a bounded number of hyperplanes. Exact minimization of empirical error over the high-dimensional grid induced in the feature space by axis-parallel hyperplanes is NP-hard. We develop two approximation schemes with performance guarantees: a greedy set covering scheme for producing a consistently labeled grid, and an integer programming rounding scheme for finding a minimum-error grid with a bounded number of hyperplanes.
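The first scheme rests on the classic greedy set cover heuristic, which repeatedly selects the set covering the most still-uncovered elements and carries an O(log n) approximation guarantee. A minimal sketch of that underlying heuristic (an illustration of the generic technique only, not the paper's grid-construction algorithm):

```python
def greedy_set_cover(universe, subsets):
    """Greedy set cover: repeatedly pick the subset that covers the most
    still-uncovered elements.  This generic heuristic achieves an
    O(log n) approximation factor; it is shown here only to illustrate
    the covering technique, not the paper's exact scheme."""
    uncovered = set(universe)
    cover = []
    while uncovered:
        # Choose the subset with the largest intersection with the
        # remaining uncovered elements.
        best = max(subsets, key=lambda s: len(uncovered & s))
        if not uncovered & best:
            raise ValueError("subsets do not cover the universe")
        cover.append(best)
        uncovered -= best
    return cover


# Example: {1,2,3} covers three elements, then {4,5} finishes the job.
chosen = greedy_set_cover({1, 2, 3, 4, 5},
                          [{1, 2, 3}, {2, 4}, {3, 4}, {4, 5}])
```

Here the greedy choice yields a cover of two subsets, which happens to be optimal for this instance; in general greedy may exceed the optimum by a logarithmic factor.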

Cite

Text

Elomaa et al. "Approximation Algorithms for Minimizing Empirical Error by Axis-Parallel Hyperplanes." European Conference on Machine Learning, 2005. doi:10.1007/11564096_53

Markdown

[Elomaa et al. "Approximation Algorithms for Minimizing Empirical Error by Axis-Parallel Hyperplanes." European Conference on Machine Learning, 2005.](https://mlanthology.org/ecmlpkdd/2005/elomaa2005ecml-approximation/) doi:10.1007/11564096_53

BibTeX

@inproceedings{elomaa2005ecml-approximation,
  title     = {{Approximation Algorithms for Minimizing Empirical Error by Axis-Parallel Hyperplanes}},
  author    = {Elomaa, Tapio and Kujala, Jussi and Rousu, Juho},
  booktitle = {European Conference on Machine Learning},
  year      = {2005},
  pages     = {547--555},
  doi       = {10.1007/11564096_53},
  url       = {https://mlanthology.org/ecmlpkdd/2005/elomaa2005ecml-approximation/}
}