On the Complexity of Training a Single Perceptron with Programmable Synaptic Delays

Abstract

We consider a single perceptron $\mathcal{N}$ with synaptic delays, which generalizes a simplified model of a spiking neuron: not only is the time a pulse needs to travel through a synapse taken into account, but the input firing rates may also take on several different levels. A synchronization technique is introduced so that results concerning the learnability of spiking neurons with binary delays also apply to $\mathcal{N}$ with arbitrary delays. In particular, the consistency problem for $\mathcal{N}$ with programmable delays, as well as its approximation version, proves to be NP-hard. It follows that perceptrons with programmable synaptic delays are not properly PAC-learnable, and that spiking neurons with arbitrary delays do not allow robust learning, unless RP = NP. In addition, we show that the representation problem for $\mathcal{N}$, i.e., the question of whether an $n$-variable Boolean function given in DNF (or as a disjunction of $O(n)$ threshold gates) can be computed by a spiking neuron, is co-NP-hard.
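To make the model concrete, the following is a minimal illustrative sketch (with assumed semantics; it is not the paper's formal definition): each synapse $i$ carries a weight `w[i]` and a programmable integer delay `d[i]`, each input is a discrete-time signal whose level may vary (the "firing rate"), and the neuron fires if the delayed weighted sum ever reaches the threshold. All function names here are hypothetical.

```python
def delayed_potential(x, w, d, t):
    """Potential sum_i w[i] * x_i(t - d[i]) at time t.

    x is a list of input signals (lists of levels); a signal is taken
    to be 0 outside its defined time range (an assumption made here).
    """
    total = 0.0
    for xi, wi, di in zip(x, w, d):
        s = t - di  # the delay d[i] shifts input i back in time
        total += wi * (xi[s] if 0 <= s < len(xi) else 0.0)
    return total

def fires(x, w, d, theta, horizon):
    """True iff the potential reaches threshold theta at some t < horizon."""
    return any(delayed_potential(x, w, d, t) >= theta for t in range(horizon))
```

For instance, with inputs `x = [[1, 1, 0, 0], [0, 0, 1, 1]]`, weights `w = [1, 1]`, and threshold `theta = 2`, the delays `d = [2, 0]` align the two pulses so the neuron fires, whereas `d = [0, 0]` leaves them misaligned and it never does. Choosing such delays consistently with a set of training examples is exactly the kind of problem the paper proves NP-hard.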

Cite

Text

Šíma. "On the Complexity of Training a Single Perceptron with Programmable Synaptic Delays." International Conference on Algorithmic Learning Theory, 2003. doi:10.1007/978-3-540-39624-6_18

Markdown

[Šíma. "On the Complexity of Training a Single Perceptron with Programmable Synaptic Delays." International Conference on Algorithmic Learning Theory, 2003.](https://mlanthology.org/alt/2003/sima2003alt-complexity/) doi:10.1007/978-3-540-39624-6_18

BibTeX

@inproceedings{sima2003alt-complexity,
  title     = {{On the Complexity of Training a Single Perceptron with Programmable Synaptic Delays}},
  author    = {Šíma, Jiří},
  booktitle = {International Conference on Algorithmic Learning Theory},
  year      = {2003},
  pages     = {221--233},
  doi       = {10.1007/978-3-540-39624-6_18},
  url       = {https://mlanthology.org/alt/2003/sima2003alt-complexity/}
}