A Recurrent Network Implementation of Time Series Classification
Abstract
An incremental credit assignment (ICRA) scheme is introduced and applied to time series classification. It is inspired by Bayes' rule, but the Bayesian connection is necessary neither for its development nor for the proof of its convergence properties. The ICRA scheme is implemented by a recurrent, hierarchical, modular neural network, which consists of a bank of predictive modules at the lower level and a decision module at the higher level. For each predictive module, a credit function is computed; the module that best predicts the observed time series behavior receives the highest credit. We prove that the credit functions converge (with probability one) to correct values. Simulation results are also presented.
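The abstract describes a bank of predictive modules whose credits are updated incrementally, with the best-predicting module accumulating the highest credit. The following is a minimal sketch of that idea, assuming a normalized, Bayes-like multiplicative update driven by one-step prediction errors; the function name, the Gaussian error likelihood, and the toy AR(1) predictors are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def icra_credits(y, predictors, sigma=0.1):
    """Illustrative incremental credit assignment over a bank of predictors.

    y          : observed time series, shape (T,)
    predictors : list of functions; each maps past observations to a
                 one-step prediction of y[t]
    sigma      : assumed noise scale in the Gaussian likelihood proxy
    """
    K = len(predictors)
    credits = np.full(K, 1.0 / K)          # uniform initial credit
    history = [credits.copy()]
    for t in range(1, len(y)):
        preds = np.array([f(y[:t]) for f in predictors])
        # Bayes-like multiplicative update: modules with small
        # prediction error receive proportionally more credit.
        like = np.exp(-0.5 * ((y[t] - preds) / sigma) ** 2)
        credits = credits * like
        credits /= credits.sum()           # normalize so credits sum to 1
        history.append(credits.copy())
    return np.array(history)

# Example: two AR(1)-style predictive modules; the data are generated
# with coefficient 0.9, so credit should concentrate on module 0.
rng = np.random.default_rng(0)
y = np.zeros(200)
for t in range(1, 200):
    y[t] = 0.9 * y[t - 1] + 0.05 * rng.standard_normal()

predictors = [lambda past: 0.9 * past[-1],
              lambda past: 0.2 * past[-1]]
history = icra_credits(y, predictors)
print(history[-1])
```

Because the update multiplies in each step's evidence and renormalizes, the credit vector behaves like a running posterior over the candidate models, which matches the paper's claim that credits converge to correct values.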
Cite
Text
Petridis and Kehagias. "A Recurrent Network Implementation of Time Series Classification." Neural Computation, 1996. doi:10.1162/NECO.1996.8.2.357
Markdown
[Petridis and Kehagias. "A Recurrent Network Implementation of Time Series Classification." Neural Computation, 1996.](https://mlanthology.org/neco/1996/petridis1996neco-recurrent/) doi:10.1162/NECO.1996.8.2.357
BibTeX
@article{petridis1996neco-recurrent,
title = {{A Recurrent Network Implementation of Time Series Classification}},
author = {Petridis, Vassilios and Kehagias, Athanasios},
journal = {Neural Computation},
year = {1996},
pages = {357--372},
doi = {10.1162/NECO.1996.8.2.357},
volume = {8},
url = {https://mlanthology.org/neco/1996/petridis1996neco-recurrent/}
}