Generalized Hopfield Networks and Nonlinear Optimization
Abstract
A nonlinear neural framework, called the Generalized Hopfield network (GHN), is proposed, which can solve systems of nonlinear equations in a parallel, distributed manner. The method is applied to the general nonlinear optimization problem. We demonstrate GHNs implementing the three most important optimization algorithms, namely the Augmented Lagrangian, Generalized Reduced Gradient and Successive Quadratic Programming methods. The study results in a dynamic view of the optimization problem and offers a straightforward model for parallelizing optimization computations, thus significantly extending the practical limits on the size of problems that can be formulated as optimization problems and that can benefit from the introduction of nonlinearities in their structure (e.g., pattern recognition, supervised learning, design of content-addressable memories).
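To make the idea concrete, the sketch below shows the flavor of a Hopfield-style network used as a continuous-time optimizer: network states evolve under gradient-flow dynamics on an Augmented Lagrangian, so the steady state of the dynamics is a constrained minimizer. This is a minimal illustration under assumed details, not the paper's implementation; the toy objective, constraint, step size, and function names are all hypothetical.

```python
import numpy as np

# Toy constrained problem (illustrative only, not from the paper):
#   minimize  f(x) = (x0 - 2)^2 + (x1 - 1)^2
#   subject to h(x) = x0 + x1 - 2 = 0
f_grad = lambda x: np.array([2.0 * (x[0] - 2.0), 2.0 * (x[1] - 1.0)])
h      = lambda x: x[0] + x[1] - 2.0
h_grad = lambda x: np.array([1.0, 1.0])

def ghn_augmented_lagrangian(x0, lam0=0.0, mu=10.0, dt=1e-3, steps=20000):
    """Euler-integrate gradient-flow 'neuron' dynamics on the
    Augmented Lagrangian  L(x, lam) = f(x) + lam*h(x) + (mu/2)*h(x)^2.
    The state x descends L while the multiplier lam ascends it,
    so the dynamics settle at a saddle point of L (a constrained optimum)."""
    x, lam = np.array(x0, dtype=float), lam0
    for _ in range(steps):
        grad_x = f_grad(x) + (lam + mu * h(x)) * h_grad(x)
        x   -= dt * grad_x           # state neurons relax downhill in x
        lam += dt * mu * h(x)        # multiplier neuron climbs uphill in lam
    return x, lam

x_star, lam_star = ghn_augmented_lagrangian([0.0, 0.0])
print(x_star)   # approaches the constrained minimizer [1.5, 0.5]
```

Because every state update depends only on local gradient information, the same dynamics can be evaluated componentwise in parallel, which is the parallelization angle the abstract refers to.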
Cite
Reklaitis et al. "Generalized Hopfield Networks and Nonlinear Optimization." Neural Information Processing Systems, 1989.
BibTeX
@inproceedings{reklaitis1989neurips-generalized,
title = {{Generalized Hopfield Networks and Nonlinear Optimization}},
author = {Reklaitis, Gintaras V. and Tsirukis, Athanasios G. and Tenorio, Manoel Fernando},
booktitle = {Neural Information Processing Systems},
year = {1989},
pages = {355-362},
url = {https://mlanthology.org/neurips/1989/reklaitis1989neurips-generalized/}
}