2004
DOI: 10.1002/cav.34

An artificial life environment for autonomous virtual agents with multi‐sensorial and multi‐perceptive features

Abstract: Our approach is based on multi-sensory integration as described in standard neuroscience theory, in which signals coming from distinct sensory systems but originating from a single object are combined. The signal-acquisition steps of filtering, selection, and simplification that precede proprioception and active and predictive perception are integrated into virtual sensors and a virtual environment. We focus on two aspects: (1) the assignment problem: determining which sensory stimuli belong to the same virtual object, and (2…
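The assignment problem named in the abstract can be pictured as grouping stimuli reported by separate virtual sensors under the virtual object that most plausibly produced them. The sketch below uses a naive nearest-object spatial gate; the class names, the distance threshold, and the heuristic itself are illustrative assumptions and not the method proposed in the paper.

```python
from dataclasses import dataclass
from math import dist

# Hypothetical data types; the paper's actual sensor and object
# representations are not given in the abstract.
@dataclass
class Stimulus:
    modality: str    # "vision", "audition", or "touch"
    position: tuple  # estimated 3D source position (x, y, z)

@dataclass
class VirtualObject:
    name: str
    position: tuple  # object position in the virtual environment

def assign_stimuli(stimuli, objects, max_distance=1.0):
    """Group stimuli by the nearest virtual object (a naive spatial
    gating heuristic, not the paper's actual assignment method)."""
    groups = {obj.name: [] for obj in objects}
    for s in stimuli:
        nearest = min(objects, key=lambda o: dist(s.position, o.position))
        if dist(s.position, nearest.position) <= max_distance:
            groups[nearest.name].append(s)  # stimulus attributed to this object
    return groups

# Example: a visual and an auditory stimulus near different objects
# end up assigned to the object that emitted them.
objects = [VirtualObject("door", (0.0, 0.0, 0.0)), VirtualObject("radio", (5.0, 0.0, 0.0))]
stimuli = [Stimulus("vision", (0.1, 0.0, 0.0)), Stimulus("audition", (4.8, 0.2, 0.0))]
print(assign_stimuli(stimuli, objects))
```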

Cited by 23 publications (12 citation statements)
References 16 publications
“…Thus, there have been studies considering how to equip agents with synthetic vision, audition, and touch. [2][3][4] Based on perception, perceptual attention provides agents with a way to attend to their environment. 5,6 Augmented reality (AR) enables users to experience computer-generated content embedded in a real environment 7 ; AR-based agents can thus be visualized among physical objects in the users' environment and interact directly with users in real time.…”
Section: Introduction
confidence: 99%
“…In particular, the behavior that any artificial or biological system should follow to accomplish certain tasks (e.g., extraction, simplification and filtering) is strongly influenced by the data supplied by its sensors. This data is in turn dependent on the perception criteria associated with each sensorial input (Conde & Thalmann, 2004).…”
Section: Introduction
confidence: 99%
“…Previously, [5] have proposed new methodologies to carry out the mapping of all information coming from virtual sensors of vision, audition and touch as well as from the Virtual Environment (VE) in the form of a "cognitive map". The approach enables the partial re-mapping of cognitive and semantic information at a behavioural level.…”
Section: Introduction
confidence: 99%
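The "cognitive map" mentioned in the statement above can be pictured as a per-object store into which percepts from the vision, audition and touch sensors are merged alongside semantic tags taken from the Virtual Environment (VE), with the semantic part open to partial re-mapping. The sketch below is an illustrative assumption about such a structure; the class, method names, and update policy are not taken from [5] or from the cited paper.

```python
# Minimal sketch of a per-object "cognitive map" (illustrative assumption).
class CognitiveMap:
    def __init__(self):
        # object_id -> {"percepts": {modality: data}, "semantics": {tag: value}}
        self.entries = {}

    def update(self, object_id, modality, percept):
        """Record the latest percept of an object for one sensory modality."""
        entry = self.entries.setdefault(object_id, {"percepts": {}, "semantics": {}})
        entry["percepts"][modality] = percept

    def remap_semantics(self, object_id, **tags):
        """Partially re-map semantic information at the behavioural level,
        overwriting only the supplied tags."""
        entry = self.entries.setdefault(object_id, {"percepts": {}, "semantics": {}})
        entry["semantics"].update(tags)

# Usage: merge vision and audition percepts for one object, then
# partially re-map its semantic tags without touching the percepts.
cmap = CognitiveMap()
cmap.update("door_1", "vision", {"position": (0.0, 0.0, 0.0), "colour": "red"})
cmap.update("door_1", "audition", {"loudness": 0.2})
cmap.remap_semantics("door_1", role="exit", passable=True)
print(cmap.entries["door_1"])
```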