On Different Facets of Regularization Theory
Abstract
This review provides a comprehensive understanding of regularization theory from different perspectives, emphasizing smoothness and simplicity principles. Using the tools of operator theory and Fourier analysis, it is shown that the solution of the classical Tikhonov regularization problem can be derived from the regularized functional defined by a linear differential (integral) operator in the spatial (Fourier) domain. State-of-the-art research relevant to the regularization theory is reviewed, covering Occam's razor, minimum description length, Bayesian theory, pruning algorithms, informational (entropy) theory, statistical learning theory, and equivalent regularization. The universal principle of regularization in terms of Kolmogorov complexity is discussed. Finally, some prospective studies on regularization theory and beyond are suggested.
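The Tikhonov regularization problem mentioned in the abstract balances data fit against a smoothness/simplicity penalty. A minimal sketch of the simplest instance (a ridge-style penalty `λ‖w‖²` on a linear model, solved via the normal equations) is shown below; the function name and the choice of the identity as the regularizing operator are illustrative assumptions, not the paper's formulation:

```python
import numpy as np

def tikhonov_solve(A, y, lam):
    """Minimize ||A w - y||^2 + lam * ||w||^2.

    Setting the gradient to zero gives the regularized normal
    equations (A^T A + lam I) w = A^T y, solved directly here.
    Note: the identity penalty is the simplest choice of
    regularizing operator; Tikhonov's general theory allows a
    differential operator in its place.
    """
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ y)
```

For `lam = 0` this reduces to ordinary least squares; increasing `lam` shrinks the solution norm, trading fidelity to the data for simplicity of the solution.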
Cite
Text
Chen and Haykin. "On Different Facets of Regularization Theory." Neural Computation, 2002. doi:10.1162/089976602760805296
Markdown
[Chen and Haykin. "On Different Facets of Regularization Theory." Neural Computation, 2002.](https://mlanthology.org/neco/2002/chen2002neco-different/) doi:10.1162/089976602760805296
BibTeX
@article{chen2002neco-different,
  title   = {{On Different Facets of Regularization Theory}},
  author  = {Chen, Zhe and Haykin, Simon},
  journal = {Neural Computation},
  year    = {2002},
  volume  = {14},
  pages   = {2791--2846},
  doi     = {10.1162/089976602760805296},
  url     = {https://mlanthology.org/neco/2002/chen2002neco-different/}
}