Some Approximation Properties of Projection Pursuit Learning Networks

Abstract

This paper will address an important question in machine learning: What kind of network architectures work better on what kind of problems? A projection pursuit learning network has a very similar structure to a one hidden layer sigmoidal neural network. A general method based on a continuous version of projection pursuit regression is developed to show that projection pursuit regression works better on angular smooth functions than on Laplacian smooth functions. There exists a ridge function approximation scheme to avoid the curse of dimensionality for approximating functions in L2(φd).
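For readers unfamiliar with the model class, the sketch below illustrates the additive ridge-function form f(x) ≈ Σ_i g_i(a_i^T x) that both projection pursuit regression and a one hidden layer sigmoidal network share: a sigmoidal network fixes each g_i to a scaled sigmoid, while projection pursuit learns each g_i as a smooth one-dimensional function of a learned projection. This is only a minimal illustration and not the construction analyzed in the paper; it replaces the smoothers and direction optimization of Friedman and Stuetzle's projection pursuit regression with random direction search and polynomial ridge fits, and every name in it is hypothetical.

import numpy as np

def fit_ridge_term(X, r, n_directions=200, degree=5, rng=None):
    """One greedy stage of a simplified projection pursuit fit:
    try random unit directions a, fit a 1-D polynomial g to the
    current residual r along z = X @ a, keep the best (a, g)."""
    rng = np.random.default_rng() if rng is None else rng
    d = X.shape[1]
    best = None
    for _ in range(n_directions):
        a = rng.normal(size=d)
        a /= np.linalg.norm(a)          # unit projection direction
        z = X @ a
        coef = np.polyfit(z, r, degree) # smooth ridge function g
        sse = np.sum((r - np.polyval(coef, z)) ** 2)
        if best is None or sse < best[0]:
            best = (sse, a, coef)
    return best[1], best[2]

def projection_pursuit_regression(X, y, n_terms=4, **kw):
    """Approximate y ≈ sum_i g_i(a_i^T x) by greedily adding ridge terms."""
    r = y.astype(float).copy()
    terms = []
    for _ in range(n_terms):
        a, coef = fit_ridge_term(X, r, **kw)
        r = r - np.polyval(coef, X @ a)  # subtract the fitted ridge term
        terms.append((a, coef))
    return terms

def predict(terms, X):
    return sum(np.polyval(coef, X @ a) for a, coef in terms)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 5))
    # toy target built from two ridge functions -- ideal for this model class
    y = np.sin(X @ np.array([1.0, 0, 0, 0, 0])) + (X @ np.array([0, 0.6, 0.8, 0, 0])) ** 2
    terms = projection_pursuit_regression(X, y, n_terms=4, rng=rng)
    print("residual RMSE:", np.sqrt(np.mean((y - predict(terms, X)) ** 2)))

Because the toy target is itself a sum of two ridge functions, a handful of greedily added terms drives the residual close to zero; functions that are well approximated by few ridge terms are exactly the setting in which this kind of scheme sidesteps the curse of dimensionality discussed in the abstract.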

Cite

Text

Zhao and Atkeson. "Some Approximation Properties of Projection Pursuit Learning Networks." Neural Information Processing Systems, 1991.

Markdown

[Zhao and Atkeson. "Some Approximation Properties of Projection Pursuit Learning Networks." Neural Information Processing Systems, 1991.](https://mlanthology.org/neurips/1991/zhao1991neurips-some/)

BibTeX

@inproceedings{zhao1991neurips-some,
  title     = {{Some Approximation Properties of Projection Pursuit Learning Networks}},
  author    = {Zhao, Ying and Atkeson, Christopher G.},
  booktitle = {Neural Information Processing Systems},
  year      = {1991},
  pages     = {936--943},
  url       = {https://mlanthology.org/neurips/1991/zhao1991neurips-some/}
}