2021
DOI: 10.31234/osf.io/yfzat
Preprint

People adaptively use information to improve their internal states and external outcomes

Abstract: Information can strongly impact people's affect, their level of uncertainty and their decisions. It is assumed that people seek information with the goal of improving all three. But are they successful at achieving this goal? Answering this question is important for assessing the impact of self-driven information consumption on people's well-being. Here, over four experiments (total N = 518) we show that participants accurately predict the impact of information on their internal states (e.g., affect and cognit…

Cited by 1 publication (1 citation statement) · References: 33 publications
“…Here, we take advantage of the fact that our task can present full probability distributions of rewards to test between three hypothesized families of uncertainty measures (Fig 2A, B). One family of uncertainty measures depends only on the probabilities of the potential outcomes, most prominently Shannon entropy (51) and related quantities such as Kullback-Leibler divergence (52), which have been used to model information preferences (44, 53, 54) and many other forms of motivation, cognition, and neural computation (55–59). A second family depends only on the magnitudes of potential outcomes, including the range, which has been proposed to regulate the dynamic range of neural activity (60–63).…”
Section: Introduction
Confidence: 99%
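
To make the excerpt concrete, here is a minimal Python sketch of the two families of uncertainty measures it names, computed over a hypothetical discrete reward lottery. The reward magnitudes, outcome probabilities, and uniform reference distribution are illustrative assumptions, not values from the cited paper.

```python
import numpy as np

# Hypothetical discrete reward lottery (illustrative values only).
rewards = np.array([0.0, 5.0, 10.0])   # potential reward magnitudes
p = np.array([0.2, 0.5, 0.3])          # probability of each outcome
q = np.full(3, 1 / 3)                  # uniform reference distribution

# Family 1: measures that depend only on the outcome probabilities.
shannon_entropy = -np.sum(p * np.log2(p))      # Shannon entropy, in bits
kl_divergence = np.sum(p * np.log2(p / q))     # KL divergence from the uniform reference

# Family 2: measures that depend only on the outcome magnitudes.
outcome_range = rewards.max() - rewards.min()  # range of potential rewards

print(f"Shannon entropy: {shannon_entropy:.3f} bits")
print(f"KL(p || uniform): {kl_divergence:.3f} bits")
print(f"Range of rewards: {outcome_range:.1f}")
```

Note how the two families dissociate: rescaling all reward magnitudes changes the range but leaves the entropy and KL divergence untouched, while redistributing probability mass across the same outcomes does the opposite.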