Schätztheoretische Analyse neuronaler Codierungsstrategien
Stefan Wilke
ISBN 978-3-89722-778-1
111 pages, year of publication: 2001
Price: 40.50 €
This dissertation deals with the application of estimation theory
to the analysis of neural codes.
Neural systems that represent stimulus information presumably
optimize their response characteristics by minimizing the error achievable
when reconstructing the stimulus from the activity.
By calculating this minimal reconstruction error either for theoretical
models of neural encoding systems or from empirically measured
activity, estimation theory serves to quantify the encoding
accuracy achievable by a given coding scheme.
Following an introduction of the basic concepts of classical
estimation theory and a discussion of the ideas behind its application
to neural coding, this thesis contains two worked-out examples
that tackle major problems in the corresponding areas of
computational neuroscience.
The first application involves a Fisher information
analysis of the representational accuracy achieved by a population
of stochastically spiking neurons that encode a stimulus with
their spike counts during some fixed time interval.
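The population analysis just described can be illustrated with a minimal sketch, assuming independent Poisson spike counts and Gaussian tuning curves; all parameter values (peak rate, tuning width, counting window, stimulus range) are illustrative choices, not values from the thesis.

```python
import numpy as np

# Minimal sketch of a Fisher-information analysis for a population of
# independent Poisson neurons with Gaussian tuning curves, whose spike
# counts over a window T encode a scalar stimulus s. All parameters
# below are illustrative assumptions.

def population_fisher_info(s, centers, f_max=50.0, sigma=0.3, T=0.1):
    """J(s) = T * sum_i f_i'(s)^2 / f_i(s) for independent Poisson counts."""
    f = f_max * np.exp(-(s - centers) ** 2 / (2 * sigma ** 2))  # tuning curves
    df = -(s - centers) / sigma ** 2 * f                        # df_i/ds
    return T * np.sum(df ** 2 / f)

centers = np.linspace(-2.0, 2.0, 41)   # preferred stimuli tiling the range
J = population_fisher_info(0.0, centers)
# Cramer-Rao bound: any unbiased estimate of s has variance >= 1/J
print(J, 1.0 / J)
```

The Cramér-Rao bound 1/J is exactly the "minimal reconstruction error" mentioned above: a larger population Fisher information means a tighter lower bound on the achievable estimation variance.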
These results lead to three main conclusions.
First, the structure of neuronal noise can substantially modify the
encoding properties of neural systems.
In particular, the claim that limited-range correlations impose an
upper limit on the encoding capacity of a population was shown to hold
only in the biologically implausible case of fixed-variance noise.
Choosing the correct neuronal noise model can therefore be
critical for theoretical analysis.
Second, considerations of parameter variability lead to the
hypothesis that the great variability observed empirically
may not simply be a byproduct of
neuronal diversity, but could be exploited by the neural system
to achieve better encoding performance.
Finally, it is demonstrated that
neural populations can choose from a wide variety of strategies to
optimize their tuning properties.
Hence, the question of optimal tuning properties cannot be
reduced to a simple "broad or narrow" dichotomy.
In the second application, a linear reconstruction approach (the
Wiener-Kolmogorov filter) was used to analyze coding strategies for
time-varying stimuli.
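The linear reconstruction idea can be sketched with a toy example, assuming a synthetic "response" that is the stimulus plus additive Gaussian noise; the signals, noise level, and segment length are illustrative and do not correspond to the models analyzed in the thesis.

```python
import numpy as np

# Minimal sketch of Wiener-filter stimulus reconstruction. The filter
# H = S_sr / S_rr is estimated by averaging cross- and auto-spectra over
# segments (Welch-style); all signals below are synthetic assumptions.
rng = np.random.default_rng(0)
n, seg = 4096, 256
kernel = np.ones(20) / np.sqrt(20.0)                          # lowpass kernel
s = np.convolve(rng.standard_normal(n), kernel, mode="same")  # slow stimulus
r = s + 0.5 * rng.standard_normal(n)                          # noisy response

S_rr = np.zeros(seg // 2 + 1)
S_sr = np.zeros(seg // 2 + 1, dtype=complex)
for k in range(n // seg):
    Sf = np.fft.rfft(s[k * seg:(k + 1) * seg])
    Rf = np.fft.rfft(r[k * seg:(k + 1) * seg])
    S_rr += np.abs(Rf) ** 2
    S_sr += Sf * np.conj(Rf)
H = S_sr / (S_rr + 1e-12)            # Wiener filter in the frequency domain

# Apply the filter segment-wise and compare reconstruction errors.
s_hat = np.concatenate([
    np.fft.irfft(H * np.fft.rfft(r[k * seg:(k + 1) * seg]), seg)
    for k in range(n // seg)
])
mse_raw = np.mean((r - s) ** 2)      # error of the raw response
mse_rec = np.mean((s_hat - s) ** 2)  # error after linear reconstruction
print(mse_raw, mse_rec)
```

The filter passes frequency bands where the response is dominated by the stimulus and suppresses bands dominated by noise, so the reconstruction error falls below that of the raw response.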
An application to motion representation in H1 neurons showed
that, as for static stimuli, the exact type of noise
(e.g. Poissonian, additive, or multiplicative)
also matters for the coding of dynamic stimuli.
Moreover, it was shown that biphasic filters achieve the best reconstruction
if their time scale corresponds to the stimulus autocorrelation time,
while the performance of single-phase filters always improves with
decreasing time scale.
A second application of the Wiener-Kolmogorov filter to a more
sophisticated model of contrast coding
in retinal activity suggested that non-linear contrast
gain control does not improve the encoding of temporal contrast
patterns in the normal physiological regime of retinal function.
However, this example also demonstrated that the application of
estimation theory to complex, real-world biological systems
is not as straightforward as it may seem from purely theoretical studies.
In conclusion, this thesis demonstrates that estimation theory provides
a unified framework for the study of neural codes for both static and
time-varying stimuli.
In addition, it successfully applies this framework to
central aspects of neural coding and derives results of general importance
for our current picture of neural representation of stimulus information.
It is hoped that this work is of value for both experimental
and theoretical neuroscientists.