TY - JOUR
AB - Models of neural responses to stimuli with complex spatiotemporal correlation structure often assume that neurons are selective for only a small number of linear projections of a potentially high-dimensional input. In this review, we explore recent modeling approaches where the neural response depends on the quadratic form of the input rather than on its linear projection, that is, the neuron is sensitive to the local covariance structure of the signal preceding the spike. To infer this quadratic dependence in the presence of an arbitrary (e.g., naturalistic) stimulus distribution, we review several inference methods, focusing in particular on two information theory–based approaches (maximization of stimulus energy and of noise entropy) and two likelihood-based approaches (Bayesian spike-triggered covariance and extensions of generalized linear models). We analyze the formal relationship between the likelihood-based and information-based approaches to show how they lead to consistent inference. We demonstrate the practical feasibility of these procedures using model neurons responding to a flickering variance stimulus.
AU - Rajan, Kanaka
AU - Marre, Olivier
AU - Tkacik, Gasper
ID - 2818
IS - 7
JF - Neural Computation
TI - Learning quadratic receptive fields from neural responses to natural stimuli
VL - 25
ER -
TY - JOUR
AB - We consider a two-parameter family of piecewise linear maps in which the moduli of the two slopes take different values. We provide numerical evidence of the existence of some parameter regions in which the Lyapunov exponent and the topological entropy remain constant. Analytical proof of this phenomenon is also given for certain cases. Surprisingly, however, the systems with that property are not conjugate, as we prove using kneading theory.
AU - Botella Soler, Vicente
AU - Oteo, José
AU - Ros, Javier
AU - Glendinning, Paul
ID - 2861
IS - 12
JF - Journal of Physics A: Mathematical and Theoretical
TI - Lyapunov exponent and topological entropy plateaus in piecewise linear maps
VL - 46
ER -
TY - JOUR
AB - Neural populations encode information about their stimulus in a collective fashion, by joint activity patterns of spiking and silence. A full account of this mapping from stimulus to neural activity is given by the conditional probability distribution over neural codewords given the sensory input. For large populations, direct sampling of these distributions is impossible, and so we must rely on constructing appropriate models. We show here that in a population of 100 retinal ganglion cells in the salamander retina responding to temporal white-noise stimuli, dependencies between cells play an important encoding role. We introduce the stimulus-dependent maximum entropy (SDME) model, a minimal extension of the canonical linear-nonlinear model of a single neuron to a pairwise-coupled neural population. We find that the SDME model gives a more accurate account of single-cell responses and, in particular, significantly outperforms uncoupled models in reproducing the distributions of population codewords emitted in response to a stimulus. We show how the SDME model, in conjunction with static maximum entropy models of population vocabulary, can be used to estimate information-theoretic quantities like average surprise and information transmission in a neural population.
AU - Granot Atedgi, Einat
AU - Tkacik, Gasper
AU - Segev, Ronen
AU - Schneidman, Elad
ID - 2863
IS - 3
JF - PLoS Computational Biology
TI - Stimulus-dependent maximum entropy models of neural population codes
VL - 9
ER -
TY - JOUR
AB - The ability of an organism to distinguish between various stimuli is limited by the structure and noise in the population code of its sensory neurons. Here we infer a distance measure on the stimulus space directly from the recorded activity of 100 neurons in the salamander retina. In contrast to previously used measures of stimulus similarity, this "neural metric" tells us how distinguishable a pair of stimulus clips is to the retina, based on the similarity between the induced distributions of population responses. We show that the retinal distance strongly deviates from Euclidean, or any static metric, yet has a simple structure: we identify the stimulus features that the neural population is jointly sensitive to, and show the support-vector-machine-like kernel function relating the stimulus and neural response spaces. We show that the non-Euclidean nature of the retinal distance has important consequences for neural decoding.
AU - Tkacik, Gasper
AU - Granot Atedgi, Einat
AU - Segev, Ronen
AU - Schneidman, Elad
ID - 2913
IS - 5
JF - Physical Review Letters
TI - Retinal metric: a stimulus distance measure derived from population neural responses
VL - 110
ER -
TY - JOUR
AB - The scale invariance of natural images suggests an analogy to the statistical mechanics of physical systems at a critical point. Here we examine the distribution of pixels in small image patches and show how to construct the corresponding thermodynamics. We find evidence for criticality in a diverging specific heat, which corresponds to large fluctuations in how "surprising" we find individual images, and in the quantitative form of the entropy vs. energy curve. We identify special image configurations as local energy minima and show that average patches within each basin are interpretable as lines and edges in all orientations.
AU - Stephens, Greg
AU - Mora, Thierry
AU - Tkacik, Gasper
AU - Bialek, William
ID - 2914
IS - 1
JF - Physical Review Letters
TI - Statistical thermodynamics of natural images
VL - 110
ER -