
Optimal Encoding in Stochastic Latent-Variable Models


Rule, Michael E; Sorbaro, Martino; Hennig, Matthias H (2020). Optimal Encoding in Stochastic Latent-Variable Models. Entropy, 22(7):714.

Abstract

In this work we explore encoding strategies learned by statistical models of sensory coding in noisy spiking networks. Early stages of sensory communication in neural systems can be viewed as encoding channels in the information-theoretic sense. However, neural populations face constraints not commonly considered in communications theory. Using restricted Boltzmann machines as a model of sensory encoding, we find that networks with sufficient capacity learn to balance precision and noise-robustness in order to adaptively communicate stimuli with varying information content. Mirroring variability suppression observed in sensory systems, informative stimuli are encoded with high precision, at the cost of more variable responses to frequent, hence less informative stimuli. Curiously, we also find that statistical criticality in the neural population code emerges at model sizes where the input statistics are well captured. These phenomena have well-defined thermodynamic interpretations, and we discuss their connection to prevailing theories of coding and statistical criticality in neural populations.
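The abstract describes restricted Boltzmann machines (RBMs) as models of a noisy encoding channel, with stochastic hidden-unit activity playing the role of the neural response. As a minimal illustrative sketch only (not the authors' code; all names, sizes, and parameters here are assumptions), a binary RBM trained with one-step contrastive divergence (CD-1) can be written as:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    """Minimal binary RBM; hidden samples act as stochastic 'neural responses'."""

    def __init__(self, n_visible, n_hidden, lr=0.1):
        # Small random weights; zero biases (illustrative initialization).
        self.W = rng.normal(0.0, 0.01, (n_visible, n_hidden))
        self.b = np.zeros(n_visible)  # visible biases
        self.c = np.zeros(n_hidden)   # hidden biases
        self.lr = lr

    def encode(self, v):
        """Sample hidden units given a stimulus (the stochastic encoding step)."""
        p = sigmoid(v @ self.W + self.c)
        return (rng.random(p.shape) < p).astype(float), p

    def decode(self, h):
        """Sample a visible reconstruction given a hidden code."""
        p = sigmoid(h @ self.W.T + self.b)
        return (rng.random(p.shape) < p).astype(float), p

    def cd1_step(self, v0):
        """One CD-1 update: contrast data statistics with one-step reconstructions."""
        h0, ph0 = self.encode(v0)
        v1, pv1 = self.decode(h0)
        _, ph1 = self.encode(pv1)
        n = len(v0)
        self.W += self.lr * (v0.T @ ph0 - pv1.T @ ph1) / n
        self.b += self.lr * (v0 - pv1).mean(axis=0)
        self.c += self.lr * (ph0 - ph1).mean(axis=0)

# Toy "stimulus" distribution: two binary patterns, repeated.
data = np.array([[1, 1, 0, 0], [0, 0, 1, 1]] * 50, dtype=float)
rbm = RBM(n_visible=4, n_hidden=8)
for _ in range(500):
    rbm.cd1_step(data)

# Reconstruction probabilities for one stimulus after training.
_, p_recon = rbm.decode(rbm.encode(data[:1])[0])
```

The repeated `encode` calls on the same stimulus return different hidden samples, which is the channel-noise analogy the abstract draws on: the code is a probability distribution over responses rather than a deterministic mapping.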



Additional indexing

Item Type: Journal Article, refereed, original work
Communities & Collections: 07 Faculty of Science > Institute of Neuroinformatics
Dewey Decimal Classification: 570 Life sciences; biology
Scopus Subject Areas: Physical Sciences > General Physics and Astronomy
Uncontrolled Keywords: General Physics and Astronomy
Language: English
Date: 28 June 2020
Deposited On: 16 Feb 2021 09:29
Last Modified: 01 Mar 2021 16:31
Publisher: MDPI Publishing
ISSN: 1099-4300
OA Status: Gold
Free access at: PubMed ID. An embargo period may apply.
Publisher DOI: https://doi.org/10.3390/e22070714
PubMed ID: 33286485

Download

Gold Open Access

Content: Published Version
Filetype: PDF
Size: 6MB
Licence: Creative Commons: Attribution 4.0 International (CC BY 4.0)