No Free Lunch for Noise Prediction

Abstract

No-free-lunch theorems have shown that learning algorithms cannot be universally good. We show that no free lunch exists for noise prediction as well: when the noise is additive and the prior over target functions is uniform, a prior on the noise distribution cannot be updated, in the Bayesian sense, from any finite data set. We emphasize the importance of a prior over the target function in order to justify superior performance for learning systems.
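The core intuition can be illustrated with a toy finite analogue (this is a sketch of the idea, not the paper's actual construction): if target values are uniform over a finite additive group, the observation y = f(x) + noise is uniform regardless of the noise distribution, so the data carry no information about the noise model.

```python
# Toy illustration (hypothetical finite setting, not the paper's construction):
# on Z_m with a uniform prior over the target value f(x), the marginal
# likelihood of any observation y is the same under every noise model,
# so a Bayesian prior over noise models cannot be updated.

m = 5  # observations and noise live in Z_m (an arbitrary toy choice)

def likelihood(y, noise_pmf):
    """P(y | noise model), marginalizing over a uniform prior on f(x) in Z_m."""
    total = 0.0
    for f in range(m):          # uniform prior over the target value
        eps = (y - f) % m       # the additive noise needed to produce y
        total += (1.0 / m) * noise_pmf[eps]
    return total

# Two very different noise models over Z_m
peaked  = [0.8, 0.05, 0.05, 0.05, 0.05]
uniform = [0.2] * 5

for y in range(m):
    assert abs(likelihood(y, peaked) - likelihood(y, uniform)) < 1e-12
# Every y is equally likely under both noise models, so the posterior
# over noise models equals the prior: no finite data set updates it.
```

The assertion holds because convolving any distribution with the uniform distribution on a finite group yields the uniform distribution, mirroring the role the uniform prior over target functions plays in the paper's result.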

Cite

Text

Magdon-Ismail. "No Free Lunch for Noise Prediction." Neural Computation, 2000. doi:10.1162/089976600300015709

Markdown

[Magdon-Ismail. "No Free Lunch for Noise Prediction." Neural Computation, 2000.](https://mlanthology.org/neco/2000/magdonismail2000neco-free/) doi:10.1162/089976600300015709

BibTeX

@article{magdonismail2000neco-free,
  title     = {{No Free Lunch for Noise Prediction}},
  author    = {Magdon-Ismail, Malik},
  journal   = {Neural Computation},
  year      = {2000},
  pages     = {547--564},
  doi       = {10.1162/089976600300015709},
  volume    = {12},
  url       = {https://mlanthology.org/neco/2000/magdonismail2000neco-free/}
}