2019
DOI: 10.1101/799064
Preprint

Efficient sampling and noisy decisions

Abstract: Noise and information sampling are ubiquitous components determining the precision in our ability to discriminate between decision alternatives; however, the source and nature of these imprecisions are unclear. Moreover, how the nervous system simultaneously considers regularities in the environment, goals of the organism, and capacity constraints to guide goal-directed behavior remains unknown. To address these issues, we elaborate a biologically- and cognitively-relevant efficient coding …


Cited by 3 publications (6 citation statements)
References: 79 publications
“…4 As emphasized in Ma and Woodford (2020), there are differences in the way that resource constraints are imposed across different models of efficient coding. While we use a specific constraint levied by the Heng et al (2020) model, the main prediction we test is qualitatively similar to other models of efficient coding in sensory perception that assume different constraints, such as Wei and Stocker (2015).…”
Section: Introduction
confidence: 70%
“…First, the foundation of our framework is the KLW model, which assumes that the DM observes noisy signals of lottery payoffs and subsequently forms optimal estimates of these payoffs through Bayesian inference. Second, we rely on the efficient coding model from Heng, Woodford, and Polanía (2020) (henceforth HWP) to endogenize the conditional distribution of noisy signals-which is called the "efficient code." As in KLW, our framework generates the probability of choosing a risky lottery over a certain option; but, crucially, by adding the efficient coding mechanism from HWP, we can assess how the probability of choosing the risky lottery varies with the environment.…”
Section: Introduction
confidence: 99%
“…Thus the statistics presented in the figures measure the degree of discrepancy with respect to this benchmark, for those trials on which the subject submits a (non-zero) bid. In the case of each lottery, the dot indicates the mean value of log(WTP/EV), pooling all subjects. The vertical whiskers mark an interval ±s around the mean, where s is the standard deviation of log(WTP) for an "average" subject, computed as the mean of s.d.…”
Section: Results
confidence: 99%
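As a concrete reading of that description, here is a minimal numpy sketch of the two statistics: the pooled mean of log(WTP/EV) for a lottery, and the whisker half-width s, taken as the across-subject mean of each subject's standard deviation of log(WTP). The data layout (wtp_by_subject as one array of non-zero bids per subject) is a hypothetical stand-in for the paper's data; the sketch only restates the computation quoted above.

```python
import numpy as np

def lottery_summary(wtp_by_subject, ev):
    """Mean of log(WTP/EV) and whisker interval for one lottery.

    wtp_by_subject: list of 1-D arrays, one per subject, containing that
    subject's non-zero WTP bids for the lottery (assumed >= 2 bids each so the
    within-subject standard deviation is defined). ev: the lottery's expected value.
    """
    # Pooled mean of log(WTP/EV) across all subjects and bids.
    pooled = np.concatenate([np.log(w / ev) for w in wtp_by_subject])
    mean_log_ratio = pooled.mean()
    # s: mean across subjects of the within-subject s.d. of log(WTP).
    s = np.mean([np.log(w).std(ddof=1) for w in wtp_by_subject])
    return mean_log_ratio, (mean_log_ratio - s, mean_log_ratio + s)

# Hypothetical bids from two subjects for a lottery with EV = 10.
bids = [np.array([8.0, 10.0, 9.0]), np.array([12.0, 11.0])]
print(lottery_summary(bids, ev=10.0))
```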