Minkowski-R Back-Propagation: Learning in Connectionist Models with Non-Euclidian Error Signals

Abstract

Many connectionist learning models are implemented using a gradient descent in a least squares error function of the output and teacher signal. The present model generalizes, in particular, back-propagation [1] by using Minkowski-r power metrics. For small r's a "city-block" error metric is approximated and for large r's the "maximum" or "supremum" metric is approached, while for r=2 the standard back-propagation model results. An implementation of Minkowski-r back-propagation is described, and several experiments are done which show that different values of r may be desirable for various purposes. Different r values may be appropriate for the reduction of the effects of outliers (noise), modeling the input space with more compact clusters, or modeling the statistics of a particular domain more naturally or in a way that may be more perceptually or psychologically meaningful (e.g. speech or vision).
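As a rough sketch of the idea (not the authors' original code), the Minkowski-r error E = (1/r) Σ |y - t|^r has gradient ∂E/∂y = |y - t|^(r-1) · sign(y - t), which reduces to the familiar least-squares delta (y - t) when r = 2. The NumPy illustration below uses hypothetical function names (minkowski_r_error, minkowski_r_delta) chosen for this example:

```python
import numpy as np

def minkowski_r_error(y, t, r=2.0):
    """Minkowski-r error: (1/r) * sum(|y - t|^r).
    r=2 gives the usual least-squares error; r near 1 approximates
    the city-block metric, and large r approaches the supremum metric."""
    return np.sum(np.abs(y - t) ** r) / r

def minkowski_r_delta(y, t, r=2.0):
    """Gradient of the Minkowski-r error w.r.t. the output y:
    |y - t|^(r-1) * sign(y - t). For r=2 this is simply (y - t),
    the standard back-propagation error signal."""
    return np.abs(y - t) ** (r - 1.0) * np.sign(y - t)

# Example (illustrative data): error signals for several r on an
# output vector whose third unit is an "outlier"; smaller r shrinks
# the outlier's influence, larger r amplifies it.
y = np.array([0.1, 0.9, 0.95])
t = np.array([0.0, 1.0, 0.0])
for r in (1.5, 2.0, 4.0):
    print(r, minkowski_r_delta(y, t, r))
```

In a full back-propagation pass this delta would replace the (y - t) term at the output layer, with the rest of the chain rule unchanged; that is what makes the generalization a drop-in modification.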

Cite

Text

Hanson and Burr. "Minkowski-R Back-Propagation: Learning in Connectionist Models with Non-Euclidian Error Signals." Neural Information Processing Systems, 1987.

Markdown

[Hanson and Burr. "Minkowski-R Back-Propagation: Learning in Connectionist Models with Non-Euclidian Error Signals." Neural Information Processing Systems, 1987.](https://mlanthology.org/neurips/1987/hanson1987neurips-minkowskir/)

BibTeX

@inproceedings{hanson1987neurips-minkowskir,
  title     = {{Minkowski-R Back-Propagation: Learning in Connectionist Models with Non-Euclidian Error Signals}},
  author    = {Hanson, Stephen Jose and Burr, David J.},
  booktitle = {Neural Information Processing Systems},
  year      = {1987},
  pages     = {348--357},
  url       = {https://mlanthology.org/neurips/1987/hanson1987neurips-minkowskir/}
}