2010
DOI: 10.1007/978-3-642-13033-5_5
Recognition and Generation of Sentences through Self-organizing Linguistic Hierarchy Using MTRNN

Abstract: We show that a Multiple Timescale Recurrent Neural Network (MTRNN) can acquire the capabilities of recognizing and generating sentences by self-organizing a hierarchical linguistic structure. There have been many studies aimed at finding whether a neural system such as the brain can acquire languages without innate linguistic faculties. These studies have found that some kinds of recurrent neural networks could learn grammar. However, these models could not acquire the capability of deterministically…

Cited by 2 publications (1 citation statement); References 13 publications.
“…The first part of the project focuses on using the Long-Short Term Memory (LSTM) [6] recurrent neural network architecture for implementing, on an iCub humanoid robot [7] (depicted in Figure 1), the gesture-word combination stage in infants before and when transitioning to multi-word utterances. This method will be compared to that of using Multiple Timescale Recurrent Neural Networks (MTRNNs) [8], which will be developed during the second part of the project and focuses on the implementation and verification of constructivist theories for language learning.…”
Section: Figure
Mentioning confidence: 99%