2017
DOI: 10.1073/pnas.1703155114
Frequency-specific directed interactions in the human brain network for language

Abstract: The brain's remarkable capacity for language requires bidirectional interactions between functionally specialized brain regions. We used magnetoencephalography to investigate interregional interactions in the brain network for language while 102 participants were reading sentences. Using Granger causality analysis, we identified inferior frontal cortex and anterior temporal regions to receive widespread input and middle temporal regions to send widespread output. This fits well with the notion that these regio…


Cited by 118 publications (106 citation statements)
References 43 publications
“…Finally, while oscillatory power has been the focus of the present review, this is only one piece of the puzzle. A more complete understanding of the neurophysiology that gives rise to language comprehension will require integrating the findings reviewed here with research investigating how different brain regions interact (e.g., via connectivity metrics; Schoffelen et al., 2017). It will also require integrating these findings with other relevant neurophysiological phenomena (e.g., Friederici & Singer, 2015), like cross-frequency coupling (Jensen & Colgin, 2007).…”
Section: General Discussion and Conclusion
confidence: 99%
“…ROIs were defined using a multi-modal parcellation from the Human Connectome Project (Supporting Figure 1, [47]). To obtain a single spatial filter for each ROI (right A1 and left A1 separately), we performed a principal components analysis on the concatenated filters of each ROI, multiplied by the sensor-level covariance matrix, and extracted the first component; see [48]. Broadband (0.5–250 Hz) sensor-level data were multiplied by this spatial filter to obtain "virtual electrodes".…”
Section: ROI Definition
confidence: 99%
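The single-filter extraction quoted above can be sketched in a few lines of numpy. This is an illustrative reconstruction, not the cited authors' code: the array names, the use of an uncentered SVD in place of a full PCA, and the toy dimensions are all assumptions.

```python
import numpy as np

def roi_virtual_electrode(W, C, data):
    """Collapse an ROI's dipole filters into one spatial filter and
    apply it to broadband sensor data (illustrative sketch).

    W    : (n_dipoles, n_sensors) concatenated LCMV filters for the ROI
    C    : (n_sensors, n_sensors) sensor-level covariance matrix
    data : (n_sensors, n_samples) broadband sensor time series
    """
    # Project the dipole filters through the sensor covariance, then take
    # the first principal direction (first right singular vector of the
    # uncentered matrix) as the ROI's summary spatial filter.
    WC = W @ C
    _, _, vt = np.linalg.svd(WC, full_matrices=False)
    spatial_filter = vt[0]                 # (n_sensors,)
    return spatial_filter @ data           # (n_samples,) "virtual electrode"

# Toy example with random data (10 dipoles, 64 sensors, 1000 samples)
rng = np.random.default_rng(0)
W = rng.standard_normal((10, 64))
C = np.cov(rng.standard_normal((64, 500)))
data = rng.standard_normal((64, 1000))
ve = roi_virtual_electrode(W, C, data)
print(ve.shape)  # (1000,)
```

The key point of the quoted approach is that the weighting by the covariance matrix lets the PCA favor filter directions that capture variance actually present in the recorded data.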
“…Using magnetoencephalography (MEG) signals from 200 subjects, we performed a quantitative assessment of the sensory-modality-independent brain activity following word onset during sentence processing. The MEG data forms part of a large publicly available dataset [40], and has been used in other publications as well [28], [39], [24], [27]. We identified widespread left-hemispheric involvement, starting from 325 ms after word onset in the temporal lobe and rapidly spreading to anterior areas.…”
confidence: 95%
“…We used linearly constrained minimum variance (LCMV) beamforming [46] and FieldTrip's single-shell method [34], where the required brain/skull boundary was obtained from the subject-specific T1-weighted anatomical images. We further reduced the dimensionality of the data to 191 parcels per hemisphere [39]. For each parcel, we obtained a parcel-specific spatial filter as follows: we concatenated the spatial filters of the dipoles comprising the parcel, and used the concatenated spatial filter to obtain a set of time courses of the reconstructed signal at each parcel.…”
confidence: 99%
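The parcel-wise step described in the excerpt above (stack the dipole filters belonging to a parcel, then apply them to the sensor data in one matrix product) can be sketched as follows. The function name, the parcel-to-dipole mapping, and the toy sizes are illustrative assumptions, not the cited pipeline.

```python
import numpy as np

def parcel_time_courses(filters, parcel_dipoles, data):
    """Reconstruct per-parcel source time courses from sensor data
    (illustrative sketch).

    filters        : (n_dipoles, n_sensors) LCMV spatial filters
    parcel_dipoles : dict mapping parcel label -> list of dipole indices
    data           : (n_sensors, n_samples) sensor time series
    """
    courses = {}
    for label, idx in parcel_dipoles.items():
        # Concatenated filters for this parcel's dipoles, applied to the
        # sensor data in a single matrix product.
        W = filters[idx]                   # (n_parcel_dipoles, n_sensors)
        courses[label] = W @ data          # (n_parcel_dipoles, n_samples)
    return courses

# Toy example: 100 dipoles over 64 sensors, split into two parcels
rng = np.random.default_rng(1)
filters = rng.standard_normal((100, 64))
data = rng.standard_normal((64, 500))
parcels = {"L_parcel_1": list(range(0, 50)),
           "L_parcel_2": list(range(50, 100))}
tc = parcel_time_courses(filters, parcels, data)
print(tc["L_parcel_1"].shape)  # (50, 500)
```

Each parcel then carries a small multivariate time series, which a further reduction step (as in the quoted excerpt) can collapse before connectivity analysis.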