2006 IEEE International Symposium on Information Theory
DOI: 10.1109/isit.2006.261867

Efficient representation as a design principle for neural coding and computation

Abstract: Does the brain construct an efficient representation of the sensory world? We review progress on this question, focusing on a series of experiments in the last decade which use fly vision as a model system in which theory and experiment can confront each other. Although the idea of efficient representation has been productive, clearly it is incomplete since it doesn't tell us which bits of sensory information are most valuable to the organism. We suggest that an organism which maximizes the (biologically meani…
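
One concrete instance of the efficient-representation idea reviewed in the abstract is the classic matched-nonlinearity argument from fly vision: a cell with a limited response range conveys the most information when its input-output curve follows the cumulative distribution of its inputs, so that all response levels are used equally often. The sketch below is purely illustrative and uses a made-up stimulus ensemble, not data or code from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stimulus ensemble: skewed "contrast" values (illustrative only).
stimuli = rng.gamma(shape=2.0, scale=1.0, size=100_000)

# Efficient-coding nonlinearity: the empirical CDF of the stimulus ensemble.
# Mapping each stimulus through the CDF of its own distribution spreads
# responses uniformly over the available output range [0, 1].
order = np.argsort(stimuli)
cdf_response = np.empty_like(stimuli)
cdf_response[order] = np.arange(1, stimuli.size + 1) / stimuli.size

def entropy_bits(samples, n_bins=32):
    """Plug-in entropy (bits) of samples discretized into n_bins equal-width bins."""
    counts, _ = np.histogram(samples, bins=n_bins)
    p = counts / counts.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# A linear nonlinearity scaled to the same [0, 1] output range, for comparison.
linear_response = stimuli / stimuli.max()

print("entropy, CDF-matched nonlinearity:", entropy_bits(cdf_response))    # ~log2(32) = 5 bits
print("entropy, linear nonlinearity:     ", entropy_bits(linear_response)) # noticeably lower
```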

Cited by 69 publications (66 citation statements); references 19 publications.
“…Although there has been much interest in the brain's ability to predict particular things, our approach emphasizes that prediction is a general problem, which can be stated in a unified mathematical structure across many contexts, from the extrapolation of trajectories to the learning of rules (20). Our results on the efficient representation of predictive information in the retina thus may hint at a much more general principle.…”
Section: Discussion (mentioning)
confidence: 94%
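
As a rough companion to the quantity invoked in this statement, the sketch below (assumptions: a synthetic Gaussian AR(1) "stimulus" and a simple plug-in histogram estimator; nothing here is taken from the retinal study) estimates the predictive information carried by the past about the future. Because an AR(1) process is first-order Markov, I(past; future) reduces to I(x_t; x_{t+1}), which has the closed form -1/2 log2(1 - a^2) bits for comparison.

```python
import numpy as np

rng = np.random.default_rng(1)

# Gaussian AR(1) process: x[t+1] = a*x[t] + noise, with unit stationary variance.
a, T = 0.8, 200_000
x = np.empty(T)
x[0] = rng.normal()
noise = rng.normal(scale=np.sqrt(1 - a**2), size=T)
for t in range(T - 1):
    x[t + 1] = a * x[t] + noise[t]

def mutual_info_bits(u, v, n_bins=24):
    """Plug-in estimate of I(U;V) in bits from a 2-D histogram."""
    joint, _, _ = np.histogram2d(u, v, bins=n_bins)
    p = joint / joint.sum()
    pu = p.sum(axis=1, keepdims=True)
    pv = p.sum(axis=0, keepdims=True)
    mask = p > 0
    return np.sum(p[mask] * np.log2(p[mask] / (pu @ pv)[mask]))

# Because AR(1) is first-order Markov, the predictive information
# I(past; future) collapses to I(x[t]; x[t+1]).
est = mutual_info_bits(x[:-1], x[1:])
exact = -0.5 * np.log2(1 - a**2)
print(f"plug-in estimate: {est:.3f} bits   closed form: {exact:.3f} bits")
```
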
“…If one adopts the view that organisms tend to implement an information parsimony principle (Laughlin 2001;Polani 2009), then this implies that biological systems will exhibit a tendency to achieve a given level of performance at the lowest informational cost possible (or perform as well as possible under a given informational bandwidth). In our formalism, this would correspond to operating close to the optimal reward/information (strictly spoken, decision complexity) trade-off curve, always assuming that a suitable reward function can be formulated (Taylor et al 2007;Bialek et al 2007).…”
Section: Soft vs Sharp Policies (mentioning)
confidence: 99%
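
The trade-off curve referred to here can be traced numerically with a Blahut-Arimoto-style iteration. The sketch below uses a toy decision problem with invented states, actions, and rewards (none of it from the cited papers): for each inverse temperature beta it finds the stochastic policy minimizing I(S; A) - beta * E[reward], so small beta gives soft, informationally cheap policies and large beta gives sharp, nearly deterministic ones.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy decision problem (hypothetical numbers): p_s is the distribution over
# world states, U[s, a] the reward for taking action a in state s.
n_states, n_actions = 4, 3
p_s = np.full(n_states, 1.0 / n_states)
U = rng.normal(size=(n_states, n_actions))

def tradeoff_point(beta, n_iter=300):
    """Blahut-Arimoto-style iteration for the policy minimizing I(S;A) - beta * E[U].

    Returns (I(S;A) in bits, expected reward) at inverse temperature beta.
    """
    p_a = np.full(n_actions, 1.0 / n_actions)            # marginal over actions
    for _ in range(n_iter):
        logits = np.log(p_a) + beta * U                  # log of unnormalized pi(a|s)
        pi = np.exp(logits - logits.max(axis=1, keepdims=True))
        pi /= pi.sum(axis=1, keepdims=True)              # pi(a|s) proportional to p(a) exp(beta*U)
        p_a = np.clip(p_s @ pi, 1e-16, None)             # updated action marginal,
        p_a /= p_a.sum()                                 # kept away from exact zero
    info = np.sum(p_s[:, None] * pi * np.log2(pi / p_a))
    reward = np.sum(p_s[:, None] * pi * U)
    return info, reward

# Small beta: soft, informationally cheap policies; large beta: sharp, costly ones.
for beta in (0.1, 1.0, 10.0):
    info, reward = tradeoff_point(beta)
    print(f"beta={beta:5.1f}   I(S;A)={info:.3f} bits   E[reward]={reward:.3f}")
```
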
“…→ X] has been studied by several authors and given different names, such as (in chronological order) convergence rate of the conditional entropy [27], excess entropy [28], stored information [29], effective measure complexity [30], past-future mutual information [31], and predictive information [32], amongst others. For a review see Ref.…”
Section: Causal States (mentioning)
confidence: 99%
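
To make the quantity concrete: for a stationary first-order Markov chain, the past-future mutual information named above (excess entropy, predictive information) reduces to the mutual information between two consecutive symbols. The sketch below computes it exactly for a two-state chain with made-up transition probabilities, not drawn from any of the cited papers.

```python
import numpy as np

# Hypothetical two-state Markov chain; T[i, j] = P(next = j | current = i).
T = np.array([[0.9, 0.1],
              [0.3, 0.7]])

# Stationary distribution: left eigenvector of T with eigenvalue 1.
evals, evecs = np.linalg.eig(T.T)
pi = np.real(evecs[:, np.argmin(np.abs(evals - 1.0))])
pi = pi / pi.sum()

# Joint distribution of two consecutive symbols under stationarity.
joint = pi[:, None] * T
marg_next = joint.sum(axis=0)

# For a first-order Markov chain the excess entropy
# E = I(past; future) collapses to I(X_t; X_{t+1}).
excess_entropy = np.sum(joint * np.log2(joint / (pi[:, None] * marg_next[None, :])))

# Entropy rate h = H(X_{t+1} | X_t), for reference.
entropy_rate = -np.sum(joint * np.log2(T))

print(f"stationary distribution: {pi}")
print(f"entropy rate h = {entropy_rate:.4f} bits/symbol")
print(f"excess entropy E = I(past; future) = {excess_entropy:.4f} bits")
```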