2007
DOI: 10.1109/mci.2007.385366

Integrating Language and Cognition: A Cognitive Robotics Approach

Cited by 39 publications (16 citation statements)
References 17 publications

“…In this way we can assign different weights to the components of the inputs. Eq. (8) is given by S_ek = 3 for e = 1, 2 and all k. This is the same effect reported in [44], in which, once the inputs are classified into different categories (because of the linguistic component), small differences in the other entries are enhanced, producing a better discrimination of the non-linguistic features. Of course, since there is really no ambiguity in the triplet representation of the inputs, the word-biased NMF mechanism lumps into a single model (say k = 1) all triplets whose third component takes on the value 1, into model k = 2 the triplets whose third component is 2, and so on.…”
Section: A. Object-Word Input Representation (supporting)
confidence: 67%
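The word-biased grouping described in this snippet can be illustrated with a minimal sketch. This is not the Neural Modeling Fields machinery of the cited papers; it only mimics the reported effect on synthetic data, and the weight vector S = (1, 1, 3), the feature values, and the function names are assumptions made purely for illustration.

```python
import numpy as np

# Hypothetical triplet inputs: two continuous (non-linguistic) features plus an
# integer word label as the third component, as in the snippet above.
rng = np.random.default_rng(0)
words = rng.integers(1, 4, size=30)                      # word label in {1, 2, 3}
features = rng.normal(loc=words[:, None], scale=0.3, size=(30, 2))
triplets = np.column_stack([features, words])

# Assumed component weights: the linguistic (third) component is weighted more
# heavily than the two feature components, biasing the grouping toward the word.
S = np.array([1.0, 1.0, 3.0])

def assign_model(triplet, centers, weights=S):
    """Assign a triplet to the model k whose center is closest in weighted distance."""
    d = ((centers - triplet) ** 2 * weights).sum(axis=1)
    return int(np.argmin(d))

# One model per word label, since the word component is unambiguous.
centers = np.array([triplets[words == w].mean(axis=0) for w in (1, 2, 3)])
assignments = np.array([assign_model(t, centers) for t in triplets])

# Every triplet whose third component is w lands in the model built from word w,
# so non-linguistic features are compared within each word-defined group.
print(np.all(assignments == words - 1))   # should print True with these weights
```

With the linguistic component weighted most heavily, each triplet is pulled into the model matching its word label, leaving only the small differences in the non-linguistic features to be resolved within each group, which is the discrimination effect the quoted passage attributes to the word-biased mechanism.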
“…The current predominant view, at least among linguists, is that language and thought are distinct abilities of the mind [4] and a quick comparison between the cognitive and linguistic abilities of apes and parrots appears to be a convincing argument for many researchers. In this contribution we carry further the rather ambitious research program of integrating language and cognition within the Neural Modeling Fields Framework (NMF) [5]- [8]. This is a task of enormous breadth that encompasses many unsolved (and, perhaps, unsolvable) problems such as object perception [9]- [11], symbol grounding which addresses the question of how physical signs can be given meaning [12]- [14], and the emergence of a common lexicon in a population of interacting agents [15]- [19].…”
Section: Introduction (mentioning)
confidence: 99%
“…As shown in [1], [2], [3], [4], [5] action and language develop in parallel, influence each other, and base themselves on each other. If brought into the world of robotics, the codevelopment of action and language skills might enable the transfer of properties of action knowledge to linguistic representations, and vice versa, thus enabling the synthesis of robots with complex behavioural and cognitive skills [6], [7].…”
Section: Introduction (mentioning)
confidence: 99%
“…In various experiments Cangelosi et al investigated the grounding of symbols in a computational model [5,6,7]. With the hypothesis that language can emerge from embodied interaction within an environment and a simultaneous exposure to words or "symbols", a number of simulations were conducted.…”
Section: Binding and Grounding in Computational Models (mentioning)
confidence: 99%