Representing Conditional Independence Using Decision Trees

Abstract

While the decision tree representation is fully expressive in theory, traditional decision trees are known to suffer from the replication problem. This problem makes decision trees large and learnable only when sufficient training data are available. In this paper, we present a new representation model, conditional independence trees (CITrees), to tackle the replication problem from a probability perspective. We propose a novel algorithm for learning CITrees. Our experiments show that CITrees outperform naive Bayes (Langley,
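The core idea the abstract describes can be sketched informally: a CITree is a decision tree whose leaves assume the remaining attributes are conditionally independent given the class, so a naive Bayes model sits at each leaf while the tree path captures attribute interactions. The sketch below is a hypothetical illustration, not the authors' learning algorithm; the one-level split, the `CITree`/`NaiveBayesLeaf` names, and the XOR example are all assumptions chosen to show why splitting first can repair naive Bayes on an interacting concept.

```python
from collections import Counter, defaultdict

class NaiveBayesLeaf:
    """Naive Bayes over the remaining attributes: the conditional-
    independence assumption a CITree leaf encodes (illustrative only)."""
    def __init__(self, rows, labels):
        self.n = len(labels)
        self.class_counts = Counter(labels)
        n_attrs = len(rows[0]) if rows else 0
        self.cond = [defaultdict(int) for _ in range(n_attrs)]
        self.values = [set() for _ in range(n_attrs)]
        for row, y in zip(rows, labels):
            for i, v in enumerate(row):
                self.cond[i][(v, y)] += 1
                self.values[i].add(v)

    def predict(self, row):
        best_c, best_p = None, -1.0
        for c, cc in self.class_counts.items():
            p = cc / self.n  # class prior
            for i, v in enumerate(row):
                # Laplace-smoothed conditional probability P(v | c)
                p *= (self.cond[i][(v, c)] + 1) / (cc + len(self.values[i]))
            if p > best_p:
                best_c, best_p = c, p
        return best_c

class CITree:
    """One-level tree: split on one attribute, naive Bayes in each leaf."""
    def __init__(self, rows, labels, split_attr):
        self.split_attr = split_attr
        groups = defaultdict(lambda: ([], []))
        for row, y in zip(rows, labels):
            rest = row[:split_attr] + row[split_attr + 1:]
            groups[row[split_attr]][0].append(rest)
            groups[row[split_attr]][1].append(y)
        self.leaves = {v: NaiveBayesLeaf(rs, ys) for v, (rs, ys) in groups.items()}

    def predict(self, row):
        leaf = self.leaves[row[self.split_attr]]
        return leaf.predict(row[:self.split_attr] + row[self.split_attr + 1:])

# XOR-like concept (class = a XOR b): plain naive Bayes cannot represent it,
# but conditioning on 'a' restores conditional independence inside each leaf.
rows = [(a, b) for a in (0, 1) for b in (0, 1)] * 5
labels = [a ^ b for a, b in rows]
tree = CITree(rows, labels, split_attr=0)
preds = [tree.predict(r) for r in [(0, 0), (0, 1), (1, 0), (1, 1)]]
print(preds)  # [0, 1, 1, 0]
```

Replicating this XOR subtree under many branches is exactly the replication problem; pushing a probabilistic model into the leaves lets one small tree cover it.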

Cite

Text

Su and Zhang. "Representing Conditional Independence Using Decision Trees." AAAI Conference on Artificial Intelligence, 2005.

Markdown

[Su and Zhang. "Representing Conditional Independence Using Decision Trees." AAAI Conference on Artificial Intelligence, 2005.](https://mlanthology.org/aaai/2005/su2005aaai-representing/)

BibTeX

@inproceedings{su2005aaai-representing,
  title     = {{Representing Conditional Independence Using Decision Trees}},
  author    = {Su, Jiang and Zhang, Harry},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2005},
  pages     = {874--879},
  url       = {https://mlanthology.org/aaai/2005/su2005aaai-representing/}
}