MODL: A Bayes Optimal Discretization Method for Continuous Attributes

Abstract

While real data often comes in mixed formats, discrete and continuous, many supervised induction algorithms require discrete data. Efficient discretization of continuous attributes is an important problem, affecting the speed, accuracy, and understandability of induction models. In this paper, we propose a new discretization method, MODL, founded on a Bayesian approach. We introduce a space of discretization models and a prior distribution defined on this model space. This yields a Bayes optimal evaluation criterion for discretizations. We then propose a new super-linear optimization algorithm that finds near-optimal discretizations. Extensive comparative experiments on both real and synthetic data demonstrate the high inductive performance of the new discretization method.
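The abstract does not reproduce the criterion itself, but the MODL evaluation score is commonly stated as the negative log of the posterior probability of a discretization: a prior term on the number of intervals, a term on the interval bounds, per-interval terms on the class distributions, and a multinomial likelihood term. The sketch below computes that score from per-interval class counts; the function names and input format are illustrative assumptions, not the paper's own code.

```python
from math import lgamma, log

def log_binom(n, k):
    # Log of the binomial coefficient C(n, k), computed via log-gamma
    # to stay numerically stable for large n.
    return lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)

def modl_cost(intervals, J):
    """MODL-style evaluation criterion of a discretization (lower is better).

    intervals: list of per-interval class-count lists, e.g. [[3, 0], [1, 4]]
    J: number of target classes
    """
    N = sum(sum(iv) for iv in intervals)   # total number of instances
    I = len(intervals)                     # number of intervals
    cost = log(N)                          # prior on the number of intervals
    cost += log_binom(N + I - 1, I - 1)    # prior on the interval bounds
    for iv in intervals:
        n_i = sum(iv)
        # prior on the class distribution within the interval
        cost += log_binom(n_i + J - 1, J - 1)
        # likelihood: log of the multinomial coefficient n_i! / (prod_j n_ij!)
        cost += lgamma(n_i + 1) - sum(lgamma(n_ij + 1) for n_ij in iv)
    return cost

# A class-pure split scores lower (better) than merging everything
# into a single mixed interval:
pure = modl_cost([[5, 0], [0, 5]], J=2)
mixed = modl_cost([[5, 5]], J=2)
```

In this toy example `pure < mixed`, so the criterion prefers the informative two-interval discretization, consistent with the Bayes optimal evaluation the abstract describes.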

Cite

Text

Boullé. "MODL: A Bayes Optimal Discretization Method for Continuous Attributes." Machine Learning, 2006. doi:10.1007/s10994-006-8364-x

Markdown

[Boullé. "MODL: A Bayes Optimal Discretization Method for Continuous Attributes." Machine Learning, 2006.](https://mlanthology.org/mlj/2006/boulle2006mlj-modl/) doi:10.1007/s10994-006-8364-x

BibTeX

@article{boulle2006mlj-modl,
  title     = {{MODL: A Bayes Optimal Discretization Method for Continuous Attributes}},
  author    = {Boullé, Marc},
  journal   = {Machine Learning},
  year      = {2006},
  pages     = {131--165},
  doi       = {10.1007/s10994-006-8364-x},
  volume    = {65},
  url       = {https://mlanthology.org/mlj/2006/boulle2006mlj-modl/}
}