Comparison of objective functions for estimating linear-nonlinear models

Author: Sharpee, Tatyana O.
Year of publication: 2008
Subject:
Document type: Working Paper
Description: This paper compares a family of methods for characterizing neural feature selectivity with natural stimuli in the framework of the linear-nonlinear model. In this model, the neural firing rate is a nonlinear function of a small number of relevant stimulus components. The relevant stimulus dimensions can be found by maximizing one of a family of objective functions, the Rényi divergences of different orders. We show that maximizing one of them, the Rényi divergence of order 2, is equivalent to least-squares fitting of the linear-nonlinear model to neural data. Next, we derive reconstruction errors in relevant dimensions found by maximizing Rényi divergences of arbitrary order in the asymptotic limit of large spike numbers. We find that the smallest errors are obtained with the Rényi divergence of order 1, also known as the Kullback-Leibler divergence. This corresponds to finding relevant dimensions by maximizing mutual information. We numerically test how these optimization schemes perform in the regime of low signal-to-noise ratio (small number of spikes and increasing neural noise) for model visual neurons. We find that optimization schemes based on either least-squares fitting or information maximization perform well even when the number of spikes is small. Information maximization provides slightly, but significantly, better reconstructions than least-squares fitting. This makes the problem of finding relevant dimensions, together with the problem of lossy compression, one of the examples where information-theoretic measures are no more data-limited than those derived from least squares.
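To make the estimation procedure concrete, here is a minimal sketch (not the authors' implementation; the simulated neuron, function names, and all parameter values are illustrative assumptions) of the information-maximization member of this family: a relevant dimension is found by maximizing a binned plug-in estimate of the Kullback-Leibler divergence between the spike-conditional and prior distributions of the stimulus projection.

    # Minimal sketch: maximally informative dimension for a simulated LN neuron.
    # Illustrative assumption, not the paper's code.
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(0)

    # Simulate an LN neuron: rate is a sigmoid of one relevant projection.
    n_samples, n_dims = 20000, 5
    stimuli = rng.standard_normal((n_samples, n_dims))
    w_true = rng.standard_normal(n_dims)
    w_true /= np.linalg.norm(w_true)
    rate = 1.0 / (1.0 + np.exp(-3.0 * (stimuli @ w_true)))  # nonlinearity
    spikes = rng.random(n_samples) < 0.2 * rate             # Bernoulli spikes

    def info_per_spike(v, n_bins=15):
        """Plug-in estimate of the KL divergence (in bits) between the
        spike-conditional and prior distributions of the projection on v."""
        v = v / np.linalg.norm(v)
        proj = stimuli @ v
        edges = np.quantile(proj, np.linspace(0.0, 1.0, n_bins + 1))
        idx = np.clip(np.searchsorted(edges, proj) - 1, 0, n_bins - 1)
        p_x = np.bincount(idx, minlength=n_bins) / n_samples
        p_x_spike = np.bincount(idx[spikes], minlength=n_bins) / spikes.sum()
        mask = (p_x > 0) & (p_x_spike > 0)
        return np.sum(p_x_spike[mask] * np.log2(p_x_spike[mask] / p_x[mask]))

    # Initialize at the spike-triggered average and refine by maximizing
    # the information estimate over candidate directions.
    w0 = stimuli[spikes].mean(axis=0)
    res = minimize(lambda v: -info_per_spike(v), w0, method="Nelder-Mead")
    w_hat = res.x / np.linalg.norm(res.x)

    print("overlap with true dimension:", abs(w_hat @ w_true))

Replacing info_per_spike with the mean-squared error between the observed spikes and a binned estimate of the firing rate along v turns the same search into, roughly, the least-squares (Rényi order 2) variant the paper compares against.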
Comment: to appear in Advances in Neural Information Processing Systems 20 (NIPS 2007)
Database: arXiv