2019
DOI: 10.1162/netn_a_00092
Large-scale directed network inference with multivariate transfer entropy and hierarchical statistical testing

Abstract: Network inference algorithms are valuable tools for the study of large-scale neuroimaging datasets. Multivariate transfer entropy is well suited for this task, being a model-free measure that captures nonlinear and lagged dependencies between time series to infer a minimal directed network model. Greedy algorithms have been proposed to efficiently deal with high-dimensional datasets while avoiding redundant inferences and capturing synergistic effects. However, multiple statistical comparisons may inflate the …
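For orientation only (this formula is not quoted from the abstract above): the multivariate, or conditional, transfer entropy used for this kind of network inference is conventionally written as a conditional mutual information between the present of the target and the past of a source, given the past of the target and of the other sources already selected as parents. The notation below (target Y, source X, selected-source set Z, past-state embedding vectors) is an assumed convention.

```latex
% Conventional form of the multivariate (conditional) transfer entropy from a
% source X to a target Y, conditioned on the target's own past and on the set Z
% of other sources already selected as parents of Y (notation assumed).
TE_{X \to Y \mid \mathbf{Z}}
  = I\!\left( Y_t \;;\; \mathbf{X}_{<t} \,\middle|\, \mathbf{Y}_{<t}, \mathbf{Z}_{<t} \right)
```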

Cited by 95 publications (153 citation statements)
References 71 publications
“…More specifically, we expect that the tail of the out-degree distribution would be fattened, and the rich-club coefficient [49] may be underestimated. These implications also apply to iterative or greedy algorithms based on multivariate TE [50][51][52][53][54], since they rely on computing the pairwise TE as a first step. TE beyond the effect of the in-degrees.…”
Section: Discussion (citation type: mentioning, confidence: 99%)
“…Functional connectivity estimated by transfer entropy. In order to quantify functional connectivity (the amount of cross-regional interaction), we used transfer entropy (TE) as a directed information-theoretic metric for estimating the relationship between the past activity state of a source region and the present activity state of a target region (27,29,30). In its basic form, the Bivariate TE (BTE) between variables X and Y is defined as the Mutual Information (MI) between the current value of the target, Y_t, and the past value of the source, X_{t−1}, conditioned on the immediate past value of the target, Y_{t−1}.…”
Section: Methods (citation type: mentioning, confidence: 99%)
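As an illustration of the bivariate TE described in this excerpt, the sketch below estimates TE(X → Y) = I(Y_t ; X_{t−lag} | Y_{t−1}) under a joint-Gaussian assumption (a linear-Gaussian estimator with single-sample embeddings). It is not the cited authors' implementation; the function names and the toy data are hypothetical.

```python
import numpy as np

def gaussian_cmi(x, y, z):
    """Conditional mutual information I(x; y | z) in nats under a joint-Gaussian
    assumption: 0.5 * log( |C_xz| * |C_yz| / (|C_z| * |C_xyz|) )."""
    def logdet_cov(*cols):
        c = np.atleast_2d(np.cov(np.column_stack(cols), rowvar=False))
        return np.linalg.slogdet(c)[1]
    return 0.5 * (logdet_cov(x, z) + logdet_cov(y, z)
                  - logdet_cov(z) - logdet_cov(x, y, z))

def bivariate_te(source, target, lag=1):
    """Bivariate TE(source -> target) with single-sample embeddings:
    I(Y_t ; X_{t-lag} | Y_{t-1})."""
    y_now = target[lag:]
    x_past = source[:-lag]
    y_past = target[lag - 1:-1]
    return gaussian_cmi(y_now, x_past, y_past)

# Toy check: X drives Y with a one-sample delay, so TE(X->Y) >> TE(Y->X).
rng = np.random.default_rng(0)
x = rng.standard_normal(10_000)
y = np.roll(x, 1) + 0.5 * rng.standard_normal(10_000)
print(bivariate_te(x, y), bivariate_te(y, x))
```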
“…We examined mesoscale functional connectivity on the two relevant timescales: throughout learning and within trials. We estimated functional connectivity using transfer entropy (TE) (27)(28)(29). TE is a directed measure of connectivity, which estimates functional connectivity based on information-theoretic measures of regional activity distributions across trials and time steps.…”
Section: Stimulus-Related Functional Connectivity Grows During Learning (citation type: mentioning, confidence: 99%)
“…where I(· : · | ·) is the conditional mutual information and Y^+, Y^−, X^− are, respectively, a future random variable of the process Y, a vector of suitably chosen random variables from the past of that process, and a suitably chosen vector of past random variables of the process X (see [1, 8–11] for considerations on the correct choice of the past random variables). TE measures the amount of information transferred between a single source and a single target process.…”
Section: Technical Background: Transfer Entropy and Multivariate TE (citation type: mentioning, confidence: 99%)
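The equation this excerpt refers to is not reproduced on this page; in the excerpt's own notation it would read as follows (a reconstruction, not a quotation):

```latex
% Transfer entropy from process X to process Y, written as the conditional
% mutual information between Y's future and X's past, given Y's own past.
TE_{X \to Y} = I\!\left( Y^{+} : X^{-} \,\middle|\, Y^{-} \right)
```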
“…Computing the mTE_tot in Equation 2 exactly is an NP-hard problem [14], and approximations are necessary for practical use. This problem was recently addressed in [15,16] with the implementation of an approximate greedy algorithm in the IDTxl toolbox, which enables large-scale directed network inference with mTE [11] and is freely available on GitHub (https://github.com/pwollstadt/IDTxl). IDTxl performs a greedy algorithm with an iterative sequence of statistical tests to infer the 'relevant' sources of the network, thus reducing the dimensionality of the problem, and allows the non-uniform embedding of the source and target time series to be properly constructed [17,18], i.e.…”
Section: Technical Background: Transfer Entropy and Multivariate TE (citation type: mentioning, confidence: 99%)
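A minimal usage sketch of the IDTxl workflow described in this excerpt is given below, following the toolbox's documented interface. The estimator choice, lag settings, and synthetic data are placeholder assumptions; consult the IDTxl documentation at the GitHub link above for the current API.

```python
import numpy as np
from idtxl.data import Data
from idtxl.multivariate_te import MultivariateTE

# Synthetic data: 5 processes x 1000 samples (dim_order 'ps' = processes, samples).
raw = np.random.randn(5, 1000)
data = Data(raw, dim_order='ps')

settings = {
    'cmi_estimator': 'JidtGaussianCMI',  # linear-Gaussian CMI estimator (needs the JIDT/Java backend)
    'max_lag_sources': 3,                # consider candidate source samples up to 3 steps in the past
    'min_lag_sources': 1,
}

# Greedy, statistically tested selection of relevant sources for every target.
network_analysis = MultivariateTE()
results = network_analysis.analyse_network(settings=settings, data=data)

# Binary adjacency matrix of the inferred directed network (FDR-corrected).
print(results.get_adjacency_matrix(weights='binary', fdr=True))
```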