Prediction and decoding of retinal spike responses with a probabilistic spiking model
Journal of Neuroscience, 25(47): 11003-11013 (2005).
Sensory encoding in spiking neurons depends on both the spatiotemporal
integration of sensory inputs and the intrinsic mechanisms governing
the dynamics and variability of spike generation. We show that the
stimulus selectivity, reliability, and timing precision of primate
retinal ganglion cell (RGC) light responses can be reproduced
accurately with a simple model consisting of a leaky
integrate-and-fire spike generator driven by a linearly filtered
stimulus, a post-spike current, and a Gaussian noise current. We fit
model parameters for individual RGCs by maximizing the likelihood of
observed spike responses to a stochastic visual stimulus. Though
compact, the fitted model predicts the detailed time structure of
responses to novel stimuli, accurately capturing the interaction
between spiking history and the encoding of the sensory stimulus. The
model also accounts for the variability in responses to repeated
stimuli, even when fit to data from a single (non-repeating) stimulus
sequence. Finally, the model can be used to derive an explicit,
maximum-likelihood decoding rule for neural spike trains, thus
providing a tool for assessing the limitations that spiking
variability imposes on sensory performance.
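The encoding model described above can be sketched as a simple simulation: a leaky integrate-and-fire voltage driven by a linearly filtered stimulus, a post-spike current injected after each spike, and additive Gaussian noise. The sketch below is illustrative only; the filter shapes, time constants, and noise level are hypothetical stand-ins, not the paper's fitted parameters, and the paper fits these quantities by maximum likelihood rather than assuming them.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Hypothetical parameters (illustrative, not fitted values) ---
dt = 1e-3            # time step (s)
T = 2.0              # simulated duration (s)
n = int(T / dt)
tau = 0.02           # membrane time constant (s)
v_th, v_reset = 1.0, 0.0   # spike threshold and reset (arbitrary units)
sigma = 0.1          # Gaussian noise current amplitude

# Linear stimulus filter k (a biphasic shape as a stand-in)
t_k = np.arange(0, 0.2, dt)
k = np.sin(2 * np.pi * t_k / 0.2) * np.exp(-t_k / 0.05)

# Post-spike current h (a decaying after-current, here hyperpolarizing)
t_h = np.arange(0, 0.1, dt)
h = -0.5 * np.exp(-t_h / 0.02)

stimulus = rng.standard_normal(n)        # stochastic (white-noise) stimulus
drive = np.convolve(stimulus, k)[:n]     # linearly filtered stimulus input

v = np.zeros(n)                          # membrane voltage trace
i_post = np.zeros(n + len(h))            # accumulated post-spike current
spike_times = []

for i in range(1, n):
    noise = sigma * np.sqrt(dt) * rng.standard_normal()
    # Leaky integration of stimulus drive plus post-spike current and noise
    v[i] = v[i - 1] + (-v[i - 1] / tau + drive[i] + i_post[i]) * dt + noise
    if v[i] >= v_th:
        spike_times.append(i * dt)
        v[i] = v_reset                   # reset after spike
        i_post[i + 1:i + 1 + len(h)] += h  # inject post-spike current

print(f"{len(spike_times)} spikes in {T:.1f} s")
```

Because each spike injects a current that feeds back into the voltage dynamics, the model captures history-dependent effects (e.g., refractoriness) alongside stimulus encoding, which is the interaction the fitted model reproduces.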