On the Efficient Classification of Data Structures by Neural Networks

Abstract

In recent years it has been shown that recurrent neural networks are adequate for processing general data structures such as trees and graphs, which opens the door to a number of interesting and previously unexplored applications. In this paper, we analyze the efficiency of learning the membership of DOAGs (Directed Ordered Acyclic Graphs) in terms of local minima of the error surface, relying on the principle that their absence is a guarantee of efficient learning. We give sufficient conditions under which the error surface is free of local minima. Specifically, we define a topological index associated with a collection of DOAGs that makes it possible to design the architecture so as to avoid local minima.

1 Introduction

It is well known that connectionist models are capable of dealing not only with static patterns but also with sequential inputs. The real world, however, often presents structured domains that can hardly be represented by simple sequences. For instance, there are cas...
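To make the setting concrete, here is a minimal sketch of how a recursive network can encode a DOAG bottom-up, with the root state fed to a linear readout for membership classification. This is an illustrative toy, not the paper's exact model: the dimensions, weight names (`W`, `U`, `w_out`), and the example graph are all assumptions.

```python
import numpy as np

# Hedged sketch (not the paper's exact architecture): each node's state
# depends on its label and the states of its ordered children; because
# the graph is acyclic, states can be computed bottom-up with memoization,
# so shared substructures are encoded only once.

rng = np.random.default_rng(0)
STATE, LABEL, MAX_OUT = 4, 3, 2  # state dim, label dim, max outdegree

W = rng.standard_normal((STATE, LABEL)) * 0.5           # label -> state
U = rng.standard_normal((MAX_OUT, STATE, STATE)) * 0.5  # one matrix per child position
w_out = rng.standard_normal(STATE)                      # readout weights

def encode(node, children, labels, memo):
    """Return the state of `node`, computing child states first (DOAG is acyclic)."""
    if node in memo:
        return memo[node]  # shared substructure: reuse the cached state
    h = W @ labels[node]
    for pos, child in enumerate(children.get(node, [])):  # children are ordered
        h = h + U[pos] @ encode(child, children, labels, memo)
    memo[node] = np.tanh(h)
    return memo[node]

# Toy DOAG: root 0 has ordered children 1 and 2, which share child 3.
children = {0: [1, 2], 1: [3], 2: [3]}
labels = {n: rng.standard_normal(LABEL) for n in range(4)}

root_state = encode(0, children, labels, {})
score = float(w_out @ root_state)  # membership score for the root's class
print(score)
```

The ordered-children assumption is what makes the graph a DOAG rather than a plain DAG: the position `pos` selects a distinct weight matrix, so swapping two children changes the encoding.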

Cite

Text

Frasconi et al. "On the Efficient Classification of Data Structures by Neural Networks." International Joint Conference on Artificial Intelligence, 1997.

Markdown

[Frasconi et al. "On the Efficient Classification of Data Structures by Neural Networks." International Joint Conference on Artificial Intelligence, 1997.](https://mlanthology.org/ijcai/1997/frasconi1997ijcai-efficient/)

BibTeX

@inproceedings{frasconi1997ijcai-efficient,
  title     = {{On the Efficient Classification of Data Structures by Neural Networks}},
  author    = {Frasconi, Paolo and Gori, Marco and Sperduti, Alessandro},
  booktitle = {International Joint Conference on Artificial Intelligence},
  year      = {1997},
  pages     = {1066--1071},
  url       = {https://mlanthology.org/ijcai/1997/frasconi1997ijcai-efficient/}
}