2017
DOI: 10.1162/neco_a_00991

Implementing a Bayes Filter in a Neural Circuit: The Case of Unknown Stimulus Dynamics

Abstract: In order to interact intelligently with objects in the world, animals must first transform neural population responses into estimates of the dynamic, unknown stimuli that caused them. The Bayesian solution to this problem is known as a Bayes filter, which applies Bayes' rule to combine population responses with the predictions of an internal model. The internal model of the Bayes filter is based on the true stimulus dynamics, and in this note, we present a method for training a theoretical neural circuit to ap…
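The recursion the abstract describes — combine the prediction of an internal model with incoming responses via Bayes' rule — can be sketched for a discrete latent state. This is a minimal illustration of a generic Bayes filter step, not the paper's neural-circuit implementation; the function name and toy numbers are assumptions:

```python
import numpy as np

def bayes_filter_step(prior, transition, likelihood):
    """One step of a discrete-state Bayes filter.

    prior      : p(z_{t-1} | x_{1:t-1}), shape (K,)
    transition : p(z_t | z_{t-1}), shape (K, K), rows sum to 1
    likelihood : p(x_t | z_t) for the new observation, shape (K,)
    returns    : posterior p(z_t | x_{1:t}), shape (K,)
    """
    prediction = transition.T @ prior      # predict: internal-model step
    posterior = likelihood * prediction    # update: Bayes' rule
    return posterior / posterior.sum()     # normalize

# Toy example: two latent states, one noisy observation.
transition = np.array([[0.9, 0.1],
                       [0.2, 0.8]])
likelihood = np.array([0.7, 0.1])          # p(x_t | z_t) for the observed x_t
prior = np.array([0.5, 0.5])
posterior = bayes_filter_step(prior, transition, likelihood)
```

Iterating this step over an observation sequence, feeding each posterior back in as the next prior, yields the full filtering distribution.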


Cited by 8 publications (7 citation statements)
References 63 publications (128 reference statements)
“…Our framework is particularly relevant for probabilistic theories of neural coding based on the theory of exponential families (Beck et al., 2007), which include theories that address the linearity of Bayesian inference in neural circuits (Ma et al., 2006), the role of phenomena such as divisive normalization in neural computation (Beck et al., 2011a), Bayesian inference about dynamic stimuli (Makin et al., 2015; Sokoloski, 2017), and the metabolic efficiency of neural coding (Ganguli and Simoncelli, 2014; Yerxa et al., 2020). These theories have proven difficult to validate quantitatively with neural data due to a lack of statistical models which are both compatible with their exponential family formulation, and can model correlated activity in recordings of large neural populations.…”
Section: Results
confidence: 99%
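The linearity result cited here (Ma et al., 2006) is that, for independent Poisson neurons, the log posterior over the stimulus is linear in the spike counts. A hedged sketch under those textbook assumptions (Gaussian tuning curves, flat prior); all parameter values are illustrative, not taken from any of the cited papers:

```python
import numpy as np

# Discretized stimulus values and Gaussian tuning curves for N Poisson neurons.
s_grid = np.linspace(-5, 5, 201)
centers = np.linspace(-4, 4, 20)
gain = 10.0
tuning = gain * np.exp(-0.5 * (s_grid[None, :] - centers[:, None]) ** 2)  # (N, S)

rng = np.random.default_rng(0)
true_idx = 100                        # s_grid[100] == 0.0, the true stimulus
r = rng.poisson(tuning[:, true_idx])  # spike counts, shape (N,)

# Poisson log likelihood with a flat prior: linear in the spike counts r.
log_post = r @ np.log(tuning) - tuning.sum(axis=0)
post = np.exp(log_post - log_post.max())
post /= post.sum()
```

The posterior peak should land near the true stimulus, and the key structural point is visible in the code: the stimulus-dependent term is a single linear map applied to `r`.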
“…Our framework is particularly relevant for probabilistic theories of neural coding based on the theory of exponential families (Beck et al., 2007), which include theories that address the linearity of Bayesian inference in neural circuits (Ma et al., 2006), the role of phenomena such as divisive normalization in neural computation (Beck et al., 2011a), Bayesian inference about dynamic stimuli (Makin et al., 2015; Sokoloski, 2017), and the metabolic efficiency of neural coding (Ganguli and Simoncelli, 2014; Yerxa et al., 2020). These theories have proven difficult to validate quantitatively with neural data due to a lack of statistical models which are both compatible with their exponential family formulation (see Equation 4), and can model correlated activity in recordings of large neural populations.…”
Section: Discussion
confidence: 99%
“…Models of neural online inference usually seek to obtain the marginal p(z_t | x_{1:t}) [11, 42] or, in addition, the pairwise joint p(z_{t-1}, z_t | x_{1:t}) [28]. However, postdiction requires updating all the latent variables z_{1:t} given each new observation x_t.…”
Section: Dynamical Encoding Functions
confidence: 99%
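The filtering-versus-postdiction distinction drawn in this statement can be illustrated for a discrete hidden Markov model: filtering keeps only p(z_t | x_{1:t}), while postdiction (smoothing) revises beliefs about earlier states given later observations. A minimal sketch; the HMM and its parameters are illustrative assumptions, not a model from the cited work:

```python
import numpy as np

def forward_filter(pi0, T, lik):
    """Filtering: alphas[t] = p(z_t | x_{1:t}) for a discrete HMM.

    pi0 : initial state distribution, shape (K,)
    T   : transition matrix p(z_t | z_{t-1}), shape (K, K), rows sum to 1
    lik : lik[t] = p(x_t | z_t), shape (N, K)
    """
    alphas = []
    alpha = pi0
    for l in lik:
        alpha = l * (T.T @ alpha)   # predict, then weight by the likelihood
        alpha = alpha / alpha.sum()
        alphas.append(alpha)
    return np.array(alphas)

def backward_smooth(alphas, T):
    """Postdiction: gammas[t] = p(z_t | x_{1:N}), revising past beliefs."""
    gammas = np.zeros_like(alphas)
    gammas[-1] = alphas[-1]
    for t in range(len(alphas) - 2, -1, -1):
        pred = T.T @ alphas[t]      # one-step prediction p(z_{t+1} | x_{1:t})
        gammas[t] = alphas[t] * (T @ (gammas[t + 1] / pred))
    return gammas

# Toy run: two states, three observations.
T = np.array([[0.9, 0.1],
              [0.2, 0.8]])
lik = np.array([[0.7, 0.1],
                [0.2, 0.6],
                [0.7, 0.1]])
alphas = forward_filter(np.array([0.5, 0.5]), T, lik)
gammas = backward_smooth(alphas, T)
```

The forward pass is exactly the online marginal computation the statement describes; the backward pass is the extra work postdiction demands, since every `gammas[t]` for t < N depends on observations that arrived after time t.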