Lazy Decision Trees

Abstract

Lazy learning algorithms, exemplified by nearest-neighbor algorithms, do not induce a concise hypothesis from a given training set; the inductive process is delayed until a test instance is given. Algorithms for constructing decision trees, such as C4.5, ID3, and CART, create a single "best" decision tree during the training phase, and this tree is then used to classify test instances. The tests at the nodes of the constructed tree are good on average, but there may be better tests for classifying a specific instance. We propose a lazy decision tree algorithm, LazyDT, that conceptually constructs the "best" decision tree for each test instance. In practice, only a path needs to be constructed, and a caching scheme makes the algorithm fast. The algorithm is robust with respect to missing values without resorting to the complicated methods usually seen in induction of decision trees. Experiments on real and artificial problems are presented.
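The core idea, building only the decision path relevant to one test instance, can be sketched as follows. This is an illustrative simplification, not the paper's exact LazyDT criterion or its caching scheme: at each step we greedily pick the attribute whose value-match split leaves the lowest class entropy on the current training subset, keep only the rows agreeing with the test instance on that attribute, and stop when the subset is pure or no attributes remain.

```python
# Sketch of lazy, per-instance decision-path classification for categorical
# data. Function names and the split criterion are illustrative assumptions,
# not taken from the paper.
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy of a non-empty list of class labels."""
    n = len(labels)
    return -sum(c / n * log2(c / n) for c in Counter(labels).values())

def lazy_classify(rows, labels, test):
    """Grow one decision path tailored to `test` and return a class label."""
    attrs = set(range(len(test)))
    while attrs and len(set(labels)) > 1:
        # Pick the attribute whose matching subset is purest (lowest entropy).
        a, idx = min(
            ((a, [i for i, r in enumerate(rows) if r[a] == test[a]])
             for a in attrs),
            key=lambda p: entropy([labels[i] for i in p[1]]) if p[1]
                          else float("inf"),
        )
        if not idx:          # no training row matches this value: stop here
            break
        rows = [rows[i] for i in idx]
        labels = [labels[i] for i in idx]
        attrs = attrs - {a}  # an attribute is tested at most once on a path
    # Majority class of the subset reached at the end of the path.
    return Counter(labels).most_common(1)[0][0]

# Tiny usage example (hypothetical weather-style data):
rows = [("sunny", "hot"), ("sunny", "cool"), ("rain", "cool"), ("rain", "hot")]
labels = ["no", "yes", "yes", "no"]
print(lazy_classify(rows, labels, ("sunny", "cool")))  # → yes
```

Note that two different test instances can induce different paths from the same training set, which is exactly what a single eagerly built tree cannot do.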

Cite

Text

Friedman et al. "Lazy Decision Trees." AAAI Conference on Artificial Intelligence, 1996.

Markdown

[Friedman et al. "Lazy Decision Trees." AAAI Conference on Artificial Intelligence, 1996.](https://mlanthology.org/aaai/1996/friedman1996aaai-lazy/)

BibTeX

@inproceedings{friedman1996aaai-lazy,
  title     = {{Lazy Decision Trees}},
  author    = {Friedman, Jerome H. and Kohavi, Ron and Yun, Yeogirl},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {1996},
  pages     = {717-724},
  url       = {https://mlanthology.org/aaai/1996/friedman1996aaai-lazy/}
}