Conditions for Occam's Razor Applicability and Noise Elimination
Abstract
The Occam's razor principle suggests that among all correct hypotheses, the simplest one best captures the structure of the problem domain and achieves the highest prediction accuracy on new instances. This principle is also used implicitly when dealing with noise, to avoid overfitting a noisy training set through rule truncation or decision tree pruning. This work gives a theoretical framework for the applicability of Occam's razor, developed into a procedure for eliminating noise from a training set. The results of an empirical evaluation show the usefulness of the presented approach to noise elimination.
Cite
Text
Gamberger and Lavrac. "Conditions for Occam's Razor Applicability and Noise Elimination." European Conference on Machine Learning, 1997. doi:10.1007/3-540-62858-4_76
Markdown
[Gamberger and Lavrac. "Conditions for Occam's Razor Applicability and Noise Elimination." European Conference on Machine Learning, 1997.](https://mlanthology.org/ecmlpkdd/1997/gamberger1997ecml-conditions/) doi:10.1007/3-540-62858-4_76
BibTeX
@inproceedings{gamberger1997ecml-conditions,
title = {{Conditions for Occam's Razor Applicability and Noise Elimination}},
author = {Gamberger, Dragan and Lavrac, Nada},
booktitle = {European Conference on Machine Learning},
year = {1997},
pages = {108-123},
doi = {10.1007/3-540-62858-4_76},
url = {https://mlanthology.org/ecmlpkdd/1997/gamberger1997ecml-conditions/}
}