2019
DOI: 10.3389/fpsyg.2019.02673
Merge-Generability as the Key Concept of Human Language: Evidence From Neuroscience

Abstract: Ever since the inception of generative linguistics, various dependency patterns have been widely discussed in the literature, particularly as they pertain to the hierarchy based on “weak generation” – the so-called Chomsky Hierarchy. However, humans can make any possible dependency patterns by using artificial means on a sequence of symbols (e.g., computer programming). The differences between sentences in human language and general symbol sequences have been routinely observed, but the question as to why such …

Cited by 9 publications (20 citation statements); references 39 publications.
“…Even though we used a limited set of words and tested syntactic features like subject-verb agreement, the number of their combinations became infinite, thus requiring the acquisition of syntactic knowledge as opposed to explicit rule and pattern learning in general. This syntactic knowledge is crucially underlain by Merge-generability [34], the key concept of human language, which is in marked contrast with artificial rules in general.…”
Section: Discussion
confidence: 99%
“…Activations in this region actually predicted “the Degree of Merger,” i.e., the maximum depth of merged subtrees (called Mergers), in a sentence; the more binary syntactic nodes (or branches) the sentence had, the more active the region became [31,32]. It has been reported that there were differential activations when participants were exposed to natural versus artificial rules of sentences [33,34]. We showed localized activation in the left language areas, including the L. IFG, under the Natural condition, while under the Artificial condition, activation was observed in the bilateral LPMC, together with widespread regions [34].…”
confidence: 99%
“…2017; Tanaka et al. 2017, 2019). For the MRI data acquisition, the participant was in a supine position, and his or her head was immobilized inside the radiofrequency coil.…”
Section: Methods
confidence: 99%
“…While a lot of generative work has been published on how to best formulate the basic combinatorial operation Merge (see Fukui, 2011; Tanaka et al., 2019 for recent empirical work), there is less research on the question of where the elements to be merged actually come from (or, more accurately, the existing research on Merge and its domains has other ways to frame the question; see section “The Numeration and Derivation Layering”). Because clauses and complex phrases contain words, the simplest suggestion would be that the domain of Merge is the Lexicon.…”
Section: Constructions in Minimalism and Their Functional Motivation
confidence: 99%