2018
DOI: 10.1080/09540091.2017.1318357

Interactive natural language acquisition in a multi-modal recurrent neural architecture

Abstract: For the complex human brain that enables us to communicate in natural language, we have gathered a good understanding of the principles underlying language acquisition and processing, knowledge about sociocultural conditions, and insights into activity patterns in the brain. However, we have not yet been able to understand the behavioural and mechanistic characteristics of natural language, nor how mechanisms in the brain allow language to be acquired and processed. In bridging the insights from behavioural psychology and neurosci…


Cited by 25 publications (23 citation statements)
References 87 publications
“…We argue that these conditions inherently enable the development of distributed representations of knowledge. For example, in our research, we found that architectural mechanisms, like different timings in the information processing in the cortex, foster compositionality that in turn enables both the development of more complex body actions and the development of language competence from primitives (Heinrich 2016). These kinds of distributed representations are coherent with the cognitive science on embodied cognition.…”
Section: Stefan Wermter, Sascha Griffiths and Stefan Heinrich (supporting)
confidence: 71%
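The timing mechanism referred to above can be illustrated with a toy sketch of leaky-integrator units operating on different timescales. This is a hypothetical minimal example, not the cited architecture: a unit with a small time constant tracks its input quickly, while a unit with a large time constant integrates slowly, which is one way different timings in processing can separate fast primitives from slower compositional structure.

```python
import numpy as np

def ctrnn_step(u, x, tau, W):
    """One Euler step of leaky integration: u <- (1 - 1/tau) * u + (W @ x) / tau."""
    return (1.0 - 1.0 / tau) * u + (W @ x) / tau

tau = np.array([2.0, 100.0])   # fast unit vs slow unit (illustrative values)
W = np.eye(2)                  # identity drive for simplicity
u = np.zeros(2)                # unit activations
x = np.ones(2)                 # constant input drive to both units

for _ in range(10):
    u = ctrnn_step(u, x, tau, W)

# After ten steps the fast unit is close to its fixed point (1.0),
# while the slow unit has barely moved from zero.
print(u)
```

Both units converge to the same fixed point, but at rates set by their time constants; in multiple-timescale recurrent networks this difference is what lets slow layers represent sequences of the primitives encoded by fast layers.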
“…For instance, in Sugita and Tani (2005), such association learning occurring on the PB level binds the semantic and the behaviour representations. Similar association learning can also be found in Heinrich and Wermter (2018). On the other hand, the single RNN we use, although more complex to train, allows higher generalisation abilities because all the modalities are learnt in a single dynamical system.…”
Section: Generalisation Analyses (mentioning)
confidence: 74%
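The contrast drawn in the statement above, binding semantics and behaviour through shared parametric-bias (PB) units rather than learning each modality separately, can be sketched in a toy example. This is a hypothetical minimal setup, not the cited models' actual architectures: one shared recurrent network receives a fixed PB vector concatenated to its input at every step, so the bound PB code selects which dynamics the same weights produce.

```python
import numpy as np

rng = np.random.default_rng(0)

# Shared weights: input (3 dims) + PB vector (2 dims) -> hidden (8) -> output (1)
W_in = rng.normal(scale=0.5, size=(8, 3 + 2))
W_rec = rng.normal(scale=0.5, size=(8, 8))
W_out = rng.normal(scale=0.5, size=(1, 8))

def run_rnn(inputs, pb):
    """Roll the shared RNN over `inputs` with a fixed parametric-bias vector."""
    h = np.zeros(8)
    outputs = []
    for x in inputs:
        z = np.concatenate([x, pb])          # PB is re-bound at every step
        h = np.tanh(W_in @ z + W_rec @ h)
        outputs.append(float(W_out @ h))
    return outputs

seq = [rng.normal(size=3) for _ in range(5)]
pb_action_a = np.array([1.0, 0.0])   # hypothetical PB code for one behaviour
pb_action_b = np.array([0.0, 1.0])   # hypothetical PB code for another

out_a = run_rnn(seq, pb_action_a)
out_b = run_rnn(seq, pb_action_b)

# Same weights and same input sequence, yet the bound PB vector
# selects different output dynamics.
print(out_a != out_b)
```

In a single-RNN scheme of the kind the statement describes, the PB vector would disappear and all modalities would instead enter as parts of one input stream to one dynamical system.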
“…The central idea is that continuous neural computation and predictive coding of sensorimotor information are crucial for internal representation formation. Recently, many studies have applied deep learning schemes to multimodal human-robot interaction, including the linguistic modality [124]-[128]. Recent studies on artificial general intelligence based on DRL follow a similar idea.…”
Section: A Computational Models For Symbol Emergence and Cognitive A… (mentioning)
confidence: 99%
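The predictive-coding idea mentioned in the statement above, forming internal representations by minimising the error between predicted and observed sensory input, can be sketched in a few lines. This is an illustrative toy example, not a model from the cited studies: a latent estimate `mu` is updated by gradient descent until a fixed linear generative mapping `W` predicts the observation.

```python
import numpy as np

def predictive_coding(observation, W, steps=200, lr=0.1):
    """Infer a latent cause mu whose prediction W @ mu matches the observation.

    Each iteration computes the sensory prediction error and takes a
    gradient step on the squared error with respect to mu.
    """
    mu = np.zeros(W.shape[1])
    for _ in range(steps):
        error = observation - W @ mu   # sensory prediction error
        mu += lr * (W.T @ error)       # error-driven update of the estimate
    return mu

# Fixed generative mapping: 2 latent causes -> 4 sensory dimensions
W = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.5, -0.5],
              [0.2, 0.3]])
true_latent = np.array([0.5, -1.0])
obs = W @ true_latent                  # noiseless observation

mu = predictive_coding(obs, W)
print(np.linalg.norm(obs - W @ mu))    # residual prediction error, near zero
```

Because `W` has full column rank, the recovered `mu` converges to the true latent cause; in the richer models the statement refers to, both the latent state and the generative mapping are learned, and the dynamics are recurrent rather than static.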