2013
DOI: 10.3390/e15051587
Function Identification in Neuron Populations via Information Bottleneck

Abstract: It is plausible to hypothesize that the spiking responses of certain neurons represent functions of the spiking signals of other neurons. A natural ensuing question concerns how to use experimental data to infer what kind of function is being computed. Model-based approaches typically require assumptions on how information is represented. By contrast, information measures are sensitive only to relative behavior: information is unchanged by applying arbitrary invertible transformations to the involved random …

Cited by 10 publications (12 citation statements)
References 13 publications
“…The LDPC code used in the simulation is an (8000, 7200) regular code with d_c = 51 and d_v = 5, generated by PEG [37]. We assume that a decoding process is performed in a block with one upper page and one lower page, and LLRs are calculated as in (23) and (24). The maximum number of decoding iterations is set to 50.…”
Section: Numerical Results (mentioning, confidence: 99%)
“…After that, several IB algorithmic approaches were proposed, such as agglomerative IB (Agg-IB), sequential IB (Seq-IB), deterministic IB (Det-IB) and Kullback-Leibler-means IB (KL-means-IB) [22]. Since its invention, IB has been widely applied in different fields, such as document classification [23], neuroscience [24], deep learning [25] and LDPC decoding [18].…”
Section: Introduction (mentioning, confidence: 99%)
“…This rule is designed to preserve the so-called relevant mutual information I(X; T) ≤ I(X; Y), where X is a properly chosen relevant random variable of interest. The method is very generic and has numerous applications, for example in image and speech processing, in astronomy, and in neuroscience [2][3][4].…”
Section: Introduction (mentioning, confidence: 99%)
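The inequality I(X; T) ≤ I(X; Y) quoted above is the data-processing inequality for the Markov chain X – Y – T: any compression T of the observation Y can only lose relevant information about X. A minimal numerical sketch (the joint distribution and the merging rule below are illustrative, not taken from any of the cited papers):

```python
import numpy as np

def mutual_information(p_ab):
    """I(A;B) in bits from a joint pmf given as a 2-D array."""
    p_a = p_ab.sum(axis=1, keepdims=True)
    p_b = p_ab.sum(axis=0, keepdims=True)
    mask = p_ab > 0
    return float((p_ab[mask] * np.log2(p_ab[mask] / (p_a @ p_b)[mask])).sum())

# Hypothetical joint distribution p(x, y) over |X| = 2, |Y| = 4.
p_xy = np.array([[0.30, 0.10, 0.05, 0.05],
                 [0.05, 0.05, 0.10, 0.30]])

# Deterministic compression T = f(Y): merge symbols {0, 1} -> t = 0 and {2, 3} -> t = 1.
p_xt = np.stack([p_xy[:, :2].sum(axis=1), p_xy[:, 2:].sum(axis=1)], axis=1)

i_xy = mutual_information(p_xy)
i_xt = mutual_information(p_xt)
assert i_xt <= i_xy + 1e-12  # data-processing inequality: I(X;T) <= I(X;Y)
```

For this toy example I(X; Y) ≈ 0.310 bits while the two-symbol compression retains I(X; T) ≈ 0.278 bits, i.e., most of the relevant information survives a 4-to-2 merge.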
“…The Information Bottleneck (IB) method for data compression was introduced by Tishby et al. in [21]; the key idea is to compress the observation while the output preserves most of the information about the relevant variable, i.e., the original source. Since then, various IB methods have been rapidly developed and applied in many fields, such as neuroscience, image processing, and deep learning [22,23,24,25]. Its partitioning principle is usually divided into soft [21] and hard [26] partitions of the original source.…”
Section: Introduction (mentioning, confidence: 99%)
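The soft-partition variant mentioned above replaces a deterministic mapping with a stochastic encoder p(t|y), and the trade-off is scored by the IB Lagrangian I(Y; T) − β·I(X; T). A sketch under assumed toy inputs (the encoder matrix and β value below are hypothetical, chosen only to make the computation concrete):

```python
import numpy as np

def mi(p_ab):
    """I(A;B) in bits from a joint pmf given as a 2-D array."""
    p_a = p_ab.sum(axis=1, keepdims=True)
    p_b = p_ab.sum(axis=0, keepdims=True)
    mask = p_ab > 0
    return float((p_ab[mask] * np.log2(p_ab[mask] / (p_a @ p_b)[mask])).sum())

# Hypothetical joint distribution p(x, y) and soft encoder p(t|y) (rows: y, cols: t).
p_xy = np.array([[0.30, 0.10, 0.05, 0.05],
                 [0.05, 0.05, 0.10, 0.30]])
p_t_given_y = np.array([[0.9, 0.1],
                        [0.7, 0.3],
                        [0.3, 0.7],
                        [0.1, 0.9]])

p_y = p_xy.sum(axis=0)                 # marginal p(y)
p_yt = p_y[:, None] * p_t_given_y      # joint p(y, t)
p_xt = p_xy @ p_t_given_y              # joint p(x, t), valid since X - Y - T is Markov

# IB Lagrangian: compress Y (small I(Y;T)) while staying relevant (large I(X;T)).
beta = 4.0
ib_objective = mi(p_yt) - beta * mi(p_xt)
```

A hard partition is recovered as the special case where every row of p(t|y) is a one-hot vector; the soft encoder here keeps I(X; T) below both I(X; Y) and I(Y; T), as the Markov chain requires.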
“…Kartik et al. [22] developed an approach based on IB that attempts to find functional relationships in a neuron population. New image segmentation algorithms based on the hard version of information bottleneck theory are presented in [23].…”
Section: Introduction (mentioning, confidence: 99%)