2013
DOI: 10.1162/neco_a_00459

Combinatorial Neural Codes from a Mathematical Coding Theory Perspective

Abstract: Shannon's seminal 1948 work gave rise to two distinct areas of research: information theory and mathematical coding theory. While information theory has had a strong influence on theoretical neuroscience, ideas from mathematical coding theory have received considerably less attention. Here we take a new look at combinatorial neural codes from a mathematical coding theory perspective, examining the error correction capabilities of familiar receptive field codes (RF codes). We find, perhaps surprisingly, that th…
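
To make the coding-theory view concrete, the following is a minimal Python sketch, not taken from the paper: it treats a combinatorial neural code as a set of binary codewords (recording which neurons co-fire) and decodes a noisy response to the nearest codeword in Hamming distance. The four-neuron code below is hypothetical and chosen only for illustration.

```python
# Minimal sketch (illustrative only, not the paper's RF-code construction):
# a combinatorial neural code is a set of binary codewords, each recording
# which neurons are co-active; error correction maps a noisy response to the
# nearest codeword in Hamming distance.

def hamming(a, b):
    """Number of positions in which two binary words differ."""
    return sum(x != y for x, y in zip(a, b))

def nearest_codeword(word, code):
    """Decode a (possibly noisy) response to the closest codeword."""
    return min(code, key=lambda c: hamming(word, c))

# Hypothetical 4-neuron code: each codeword lists which neurons fire together.
code = [
    (0, 0, 0, 0),
    (1, 1, 0, 0),
    (0, 1, 1, 0),
    (0, 0, 1, 1),
]

noisy = (0, 1, 1, 1)                  # one neuron fired spuriously
print(nearest_codeword(noisy, code))  # -> (0, 1, 1, 0)
```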

Cited by 32 publications (36 citation statements) · References 52 publications

Citation statements (ordered by relevance):
“…3f, details vs. time in Supplementary Fig. 3b), suggesting that a combinatorial coding scheme [62][63][64][65] underlies their decoding performance. Together, these findings allude to a neural code that depended little on the precise activity levels of neurons, but more on their identities out of a time-dependent subset of participating neurons.…”
Section: Multiple Present- and Past-trial Task Information Can Be Uniq… (mentioning)
confidence: 99%
“…, and Lk_{12}(∆|_{[4]}) is the non-contractible disconnected graph in Figure 4C. Thus, ({1, 2}, {3, 4}) is a local obstruction, and so C cannot be convex.…”
Section: Algebraic Signature (mentioning)
confidence: 99%
“…Similarly, ({1}, {2, 3, 4}) gives another local obstruction that is not detectable from the canonical form. Specifically, ({1}, {2, 3, 4}) ∈ RF(C) since U_1 ⊆ U_2 ∪ U_3 ∪ U_4 with U_1 ∩ U_i ≠ ∅ for each i = 2, 3, 4, and Lk_1(∆|_{[4]}) is the non-contractible hollow simplex shown in Figure 4D. In fact, it turns out that every non-maximal σ ∈ ∆ has a related RF relationship that is a local obstruction (see [3, Table 2 in Supplementary Text S1]).…”
Section: Algebraic Signature (mentioning)
confidence: 99%
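
The link computation behind these statements is easy to make concrete. The following is an illustrative Python sketch under assumed inputs: the simplicial complex below is a small hypothetical example, not the ∆|_{[4]} of the cited figure; the link is computed as Lk_σ(∆) = {τ ∈ ∆ : τ ∩ σ = ∅ and τ ∪ σ ∈ ∆}, and a disconnected link of the kind printed here is the sort of non-contractible link that signals a local obstruction.

```python
# Illustrative sketch: computing the link Lk_sigma(Delta) of a simplex sigma
# in a simplicial complex Delta, represented as a set of frozensets closed
# under taking subsets. The complex below is hypothetical, not the Delta|[4]
# from the cited figure.

from itertools import combinations

def closure(facets):
    """Simplicial complex generated by the given facets (all subsets included)."""
    delta = set()
    for f in facets:
        f = frozenset(f)
        for k in range(len(f) + 1):
            delta.update(frozenset(s) for s in combinations(f, k))
    return delta

def link(sigma, delta):
    """Lk_sigma(Delta) = { tau in Delta : tau ∩ sigma = ∅ and tau ∪ sigma in Delta }."""
    sigma = frozenset(sigma)
    return {tau for tau in delta if not (tau & sigma) and (tau | sigma) in delta}

delta = closure([{1, 2, 3}, {1, 2, 4}, {3, 4}])
print(sorted(map(sorted, link({1, 2}, delta))))
# -> [[], [3], [4]]: two isolated vertices with no edge between them,
#    i.e. a disconnected (hence non-contractible) link.
```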
“…Interestingly, in exploring the implications of shifting focus from information theory to coding theory in terms of influence upon theoretical neuroscience, (Curto, Itskov et al 2013) have pointed to this same tradeoff, though their treatment uses error rate (coding accuracy) instead of storage capacity. We point out that understanding how neural correlation ultimately affects things like storage capacity is considered largely unknown and an active area of research (Latham 2017).…”
Section: Novel Explanation of Noise and Correlation (mentioning)
confidence: 99%