Optimal Neural Tuning Curves for Arbitrary Stimulus Distributions: Discrimax, Infomax and Minimum $L_p$ Loss
Abstract
In this work we study how the stimulus distribution influences the optimal coding of an individual neuron. Closed-form solutions for the optimal sigmoidal tuning curve are provided for a neuron obeying Poisson statistics under a given stimulus distribution. We consider a variety of optimality criteria, including maximizing discriminability, maximizing mutual information, and minimizing estimation error under a general $L_p$ norm. We generalize the Cramér-Rao lower bound and show that, in the asymptotic limit, the $L_p$ loss can be written as a functional of the Fisher information by proving the moment convergence of certain functions of Poisson random variables. In this manner, we show how the optimal tuning curve depends on the loss function, and establish the equivalence of maximizing mutual information and minimizing $L_p$ loss in the limit as $p$ goes to zero.
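The abstract's argument runs through the Fisher information of a Poisson neuron and the Cramér-Rao bound it implies for stimulus estimation. The sketch below computes that quantity numerically for a generic sigmoidal tuning curve; the logistic form of the curve, the firing-rate ceiling, and the coding window are illustrative assumptions rather than the paper's specific choices.

```python
# Minimal sketch (illustrative, not the paper's construction): Fisher information
# of a single Poisson neuron with an assumed sigmoidal tuning curve, and the
# resulting Cramer-Rao lower bound on unbiased estimation of the stimulus.
import numpy as np

h_max = 50.0   # assumed maximum firing rate (spikes/s)
T = 1.0        # assumed coding window (s)

def tuning_curve(s):
    """Assumed logistic tuning curve h(s), saturating at h_max."""
    return h_max / (1.0 + np.exp(-s))

def fisher_information(s, ds=1e-4):
    """For Poisson spiking with mean count T*h(s), J(s) = T * h'(s)^2 / h(s)."""
    h = tuning_curve(s)
    dh = (tuning_curve(s + ds) - tuning_curve(s - ds)) / (2.0 * ds)
    return T * dh**2 / h

s = np.linspace(-5.0, 5.0, 11)
J = fisher_information(s)
crlb = 1.0 / J   # Cramer-Rao bound: variance of any unbiased estimator of s
print(np.round(crlb, 4))
```

Under these assumptions, the bound is tightest where the tuning curve is steep relative to its firing rate, which is the intuition behind shaping the curve to the stimulus distribution and loss function as the paper describes.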
Cite
Text
Wang et al. "Optimal Neural Tuning Curves for Arbitrary Stimulus Distributions: Discrimax, Infomax and Minimum $L_p$ Loss." Neural Information Processing Systems, 2012.
BibTeX
@inproceedings{wang2012neurips-optimal,
title = {{Optimal Neural Tuning Curves for Arbitrary Stimulus Distributions: Discrimax, Infomax and Minimum $L_p$ Loss}},
author = {Wang, Zhuo and Stocker, Alan and Lee, Daniel D},
booktitle = {Neural Information Processing Systems},
year = {2012},
pages = {2168--2176},
url = {https://mlanthology.org/neurips/2012/wang2012neurips-optimal/}
}