Optimization of the Kernel Functions in a Probabilistic Neural Network Analyzing the Local Pattern Distribution

Abstract

This article proposes a procedure for the automatic determination of the elements of the covariance matrix of the Gaussian kernel function of probabilistic neural networks. Two matrices, a rotation matrix and a matrix of variances, can be calculated by analyzing the local environment of each training pattern; their combination forms the covariance matrix of that training pattern. This automation has two advantages: first, it frees the neural network designer from having to specify the complete covariance matrix, and second, it yields a network with better generalization ability than the original model. On a variation of the famous two-spiral problem and on real-world examples from the UCI Machine Learning Repository, the proposed model achieves a classification rate that is not only better than that of the original probabilistic neural network but also outperforms other well-known classification techniques.
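The idea in the abstract can be illustrated with a small sketch. The code below is not the authors' algorithm; it is a hedged, minimal interpretation that assumes the per-pattern covariance is built as C = R V Rᵀ, with the rotation R and the variances V estimated from each pattern's nearest neighbors (a standard local-PCA choice), and then used inside a plain Gaussian-kernel PNN. All function names and the neighborhood size `k` are illustrative, not from the paper.

```python
import numpy as np

def local_covariance(x, neighbors, eps=1e-6):
    """Illustrative local covariance: rotation R from the eigenvectors of the
    neighborhood scatter, variances V from its eigenvalues, C = R V R^T."""
    diffs = neighbors - x
    scatter = diffs.T @ diffs / len(neighbors)
    eigvals, R = np.linalg.eigh(scatter)       # R: rotation (eigenvector) matrix
    V = np.diag(np.maximum(eigvals, eps))      # V: diagonal matrix of variances
    return R @ V @ R.T

def pnn_classify(x, train_X, train_y, k=3, ridge=1e-6):
    """PNN decision: per class, average Gaussian kernels centered on the
    training patterns, each kernel using its own local covariance."""
    d = train_X.shape[1]
    scores = {}
    for c in np.unique(train_y):
        Xc = train_X[train_y == c]
        total = 0.0
        for xi in Xc:
            # the pattern's "local environment": its k nearest same-class neighbors
            dist = np.linalg.norm(Xc - xi, axis=1)
            nbrs = Xc[np.argsort(dist)[1:k + 1]] if len(Xc) > k else Xc
            C = local_covariance(xi, nbrs) + ridge * np.eye(d)
            diff = x - xi
            total += np.exp(-0.5 * diff @ np.linalg.inv(C) @ diff) \
                     / np.sqrt(np.linalg.det(C))
        scores[c] = total / len(Xc)
    return max(scores, key=scores.get)         # class with the largest density
```

Note the design choice this sketch makes concrete: instead of one global smoothing parameter (as in the original PNN), every training pattern carries its own full covariance, so the kernels can stretch and rotate to follow the local pattern distribution.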

Cite

Text

Galleske and Castellanos. "Optimization of the Kernel Functions in a Probabilistic Neural Network Analyzing the Local Pattern Distribution." Neural Computation, 2002. doi:10.1162/089976602753633448

Markdown

[Galleske and Castellanos. "Optimization of the Kernel Functions in a Probabilistic Neural Network Analyzing the Local Pattern Distribution." Neural Computation, 2002.](https://mlanthology.org/neco/2002/galleske2002neco-optimization/) doi:10.1162/089976602753633448

BibTeX

@article{galleske2002neco-optimization,
  title     = {{Optimization of the Kernel Functions in a Probabilistic Neural Network Analyzing the Local Pattern Distribution}},
  author    = {Galleske, Ingo and Castellanos, Juan},
  journal   = {Neural Computation},
  year      = {2002},
  pages     = {1183--1194},
  doi       = {10.1162/089976602753633448},
  volume    = {14},
  url       = {https://mlanthology.org/neco/2002/galleske2002neco-optimization/}
}