Parameterized Complexity of Small Decision Tree Learning

Abstract

We study the NP-hard problem of learning a decision tree (DT) of smallest depth or size from data. We provide the first parameterized complexity analysis of the problem and draw a detailed parameterized complexity map for the natural parameters: size or depth of the DT, maximum domain size of all features, and the maximum Hamming distance between any two examples. Our main result shows that learning DTs of smallest depth or size is fixed-parameter tractable (FPT) parameterized by the combination of all three of these parameters. We contrast this FPT result with various hardness results that underline the algorithmic significance of the considered parameters.
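To make the studied problem concrete, here is a minimal brute-force sketch in Python of the minimum-depth variant: given classified examples, compute the smallest depth of a decision tree (branching on feature values) that classifies every example correctly. This naive search runs in exponential time and only illustrates the problem statement; it is not the paper's FPT algorithm, and the function name min_depth is ours.

    from math import inf

    def min_depth(examples, features):
        """Depth of a smallest decision tree classifying `examples` exactly.

        examples : list of (assignment, label) pairs, where `assignment`
                   maps each feature name to its value in the example
        features : set of feature names still available for branching

        Naive exponential-time search; an illustration only, not the
        paper's fixed-parameter tractable algorithm.
        """
        if len({label for _, label in examples}) <= 1:
            return 0                          # a single leaf suffices
        best = inf
        for f in features:
            # Branch on f: one child per value of f among the examples.
            children = {}
            for assignment, label in examples:
                children.setdefault(assignment[f], []).append((assignment, label))
            if len(children) <= 1:
                continue                      # f does not split the examples
            best = min(best, 1 + max(min_depth(child, features - {f})
                                     for child in children.values()))
        return best                           # inf if the examples conflict

    # XOR over two binary features: no single test separates the classes,
    # so the smallest consistent tree has depth 2.
    data = [({"a": 0, "b": 0}, "-"), ({"a": 0, "b": 1}, "+"),
            ({"a": 1, "b": 0}, "+"), ({"a": 1, "b": 1}, "-")]
    print(min_depth(data, {"a", "b"}))        # -> 2

The parameters from the abstract appear naturally here: the recursion depth is bounded by the target DT depth, and the branching factor of each node is bounded by the maximum domain size of the features.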

Cite

Text

Ordyniak and Szeider. "Parameterized Complexity of Small Decision Tree Learning." AAAI Conference on Artificial Intelligence, 2021. doi:10.1609/AAAI.V35I7.16800

Markdown

[Ordyniak and Szeider. "Parameterized Complexity of Small Decision Tree Learning." AAAI Conference on Artificial Intelligence, 2021.](https://mlanthology.org/aaai/2021/ordyniak2021aaai-parameterized/) doi:10.1609/AAAI.V35I7.16800

BibTeX

@inproceedings{ordyniak2021aaai-parameterized,
  title     = {{Parameterized Complexity of Small Decision Tree Learning}},
  author    = {Ordyniak, Sebastian and Szeider, Stefan},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2021},
  pages     = {6454--6462},
  doi       = {10.1609/AAAI.V35I7.16800},
  url       = {https://mlanthology.org/aaai/2021/ordyniak2021aaai-parameterized/}
}