2022
DOI: 10.1609/aaai.v36i10.21427
Probing Word Syntactic Representations in the Brain by a Feature Elimination Method

Abstract: Neuroimaging studies have identified multiple brain regions that are associated with semantic and syntactic processing when comprehending language. However, existing methods cannot explore the neural correlates of fine-grained word syntactic features, such as part-of-speech and dependency relations. This paper proposes an alternative framework to study how different word syntactic features are represented in the brain. To separate each syntactic feature, we propose a feature elimination method, called Mean Vec…

Cited by 5 publications (12 citation statements). References 26 publications.
“…In particular, the in silico paradigm can both draw upon large-scale multidisciplinary efforts to build tools and methods for interpreting neural network language models (Ettinger et al., 2018; Hewitt & Manning, 2019; Ravfogel et al., 2020; Sundararajan et al., 2017) and contribute to them by providing a human neural benchmark. Furthermore, more interpretable models allow for novel causal intervention experiments that perturb and control ANNs in ways that biological neural networks cannot be perturbed (Zhang et al., 2022).…”
Section: Discussion
Confidence: 99%
“…The author believes that language-cognition experiments combined with computational models can overcome the research limitations above. For example, computational models can separate different experimental variables and characterize the roles of different language variables and cognitive functions using neural activity data collected from natural texts [136,137,138]. As the performance of neural-network-based language-computation methods continues to improve, models can separate different language features with increasing accuracy, so that visual and auditory perception, multimodal information fusion, and language in different regions of the brain can be computed on the same batch of data.…”
Section: Correlating Multiple Linguistic Variables and Cognitive Func...
Confidence: 99%
“…(1) Basic Syntactic Features: Similar to [29,4,5], we use various multi-dimensional syntactic features, namely Complexity Metrics (Node Count (NC), Word Length (WL), Word Frequency (WF)), Part-of-speech (POS) tags, and Dependency (DEP) tags, described briefly below. Node Count (NC): the node count for each word is the number of subtrees that are completed by incorporating that word into its sentence.…”
Section: Feature Representations
Confidence: 99%
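The per-word complexity metrics quoted above (WL, WF) can be sketched in plain Python. This is a minimal illustration, not the cited papers' pipeline: the function name `complexity_features` and the add-one smoothing are assumptions, and the POS/DEP tags and Node Count would in practice come from a part-of-speech tagger and a syntactic parser, which are not shown here.

```python
from collections import Counter
import math

def complexity_features(sentence_tokens, corpus_tokens):
    """Per-word Word Length (WL) and log Word Frequency (WF) features.

    Node Count (NC) would additionally require a parse tree: it counts
    the subtrees completed when each word is incorporated into the sentence.
    """
    counts = Counter(t.lower() for t in corpus_tokens)
    total = sum(counts.values())
    feats = []
    for tok in sentence_tokens:
        wl = len(tok)                                    # Word Length: characters per token
        freq = (counts[tok.lower()] + 1) / (total + 1)   # add-one smoothed relative frequency
        feats.append({"word": tok, "WL": wl, "WF": math.log(freq)})
    return feats
```

Each word thus maps to a small feature vector that can be concatenated with one-hot POS and DEP encodings before being regressed against brain recordings.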
“…The central aim of brain encoding for language-processing analysis is to unravel how the brain represents linguistic knowledge (i.e., semantic and syntactic properties) and processes sentences [1,2,3,4,5] by modeling the effect of such information on brain recordings. For instance, using functional Magnetic Resonance Imaging (fMRI) brain recordings, a number of previous studies have investigated the alignment between text-stimulus representations extracted from language models (e.g.…”
Section: Introduction
Confidence: 99%
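An encoding model of the kind this citation describes — predicting voxel responses from stimulus feature vectors — is commonly fit with ridge regression. The sketch below is a generic illustration under that assumption; the helper names and the closed-form solver are choices made here, not details taken from the cited studies.

```python
import numpy as np

def fit_encoding_model(X, Y, alpha=1.0):
    """Ridge-regression encoding model.

    X: stimulus features, shape (n_samples, n_features)
    Y: voxel responses,  shape (n_samples, n_voxels)
    Closed form: W = (X^T X + alpha * I)^{-1} X^T Y
    """
    n_features = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(n_features), X.T @ Y)

def voxelwise_correlation(Y_true, Y_pred):
    """Per-voxel Pearson correlation between measured and predicted responses."""
    yt = Y_true - Y_true.mean(axis=0)
    yp = Y_pred - Y_pred.mean(axis=0)
    denom = np.sqrt((yt ** 2).sum(axis=0) * (yp ** 2).sum(axis=0))
    return (yt * yp).sum(axis=0) / denom
```

Prediction accuracy is then typically reported as the per-voxel correlation between held-out measured and predicted responses, which is how such studies quantify alignment between model features and brain activity.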