2019
DOI: 10.1016/j.neunet.2019.08.020
Self-organizing neural networks for universal learning and multimodal memory encoding

Cited by 31 publications (15 citation statements) · References 36 publications
“…The higher accuracy is achieved thanks to a redundancy gain that reduces the amount of uncertainty in the resulting information. Recent works show a growing interest toward multi-sensory fusion in several application areas, such as developmental robotics (Droniou et al., 2015; Zahra and Navarro-Alarcon, 2019), audio-visual signal processing (Shivappa et al., 2010; Rivet et al., 2014), spatial perception (Pitti et al., 2012), attention-driven selection (Braun et al., 2019) and tracking (Zhao and Zeng, 2019), memory encoding (Tan et al., 2019), emotion recognition (Zhang et al., 2019), multi-sensory classification (Cholet et al., 2019), HMI (Turk, 2014), remote sensing and earth observation (Debes et al., 2014), medical diagnosis (Hoeks et al., 2011), and understanding brain functionality (Horwitz and Poeppel, 2002).…”
Section: Introduction (mentioning; confidence: 99%)
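The excerpt above attributes higher accuracy to a "redundancy gain" from fusing multiple sensory channels. A minimal numerical sketch of that effect (illustrative only, not taken from any of the cited works): averaging two independent, equal-variance sensor readings of the same quantity halves the variance of the estimate.

```python
import numpy as np

# Two independent noisy sensors observing the same ground truth.
# The sensor noise level and sample count are arbitrary choices
# for illustration.
rng = np.random.default_rng(42)
truth = 1.0
n = 100_000
sensor_a = truth + rng.normal(0.0, 0.5, n)
sensor_b = truth + rng.normal(0.0, 0.5, n)

# Simplest possible fusion rule: average the two readings.
fused = 0.5 * (sensor_a + sensor_b)

var_single = sensor_a.var()
var_fused = fused.var()
# For independent sensors with equal variance, the fused estimate's
# variance is half the single-sensor variance: var_fused ≈ var_single / 2.
```

This is the textbook redundancy gain; real multimodal fusion systems use weighted or learned combination rules, but the uncertainty-reduction principle is the same.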
“…Density clustering: findClusters function in the densityClust package [46]; hierarchical clustering: hclust function; SOM (self-organizing map) [47].…”
Section: Data Availability (mentioning; confidence: 99%)
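The excerpt above names three unsupervised methods from a citing paper: density clustering (densityClust::findClusters), hierarchical clustering (hclust), and a self-organizing map. A rough Python sketch of the latter two, using SciPy as the analogue of R's hclust and a hand-rolled 1-D SOM; the toy data, grid size, and learning schedule are assumptions for illustration, not the cited paper's setup.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Toy data: two well-separated 2-D blobs (an assumption for
# illustration; the cited paper's data is not reproduced here).
rng = np.random.default_rng(0)
data = np.vstack([rng.normal(0.0, 0.1, (20, 2)),
                  rng.normal(1.0, 0.1, (20, 2))])

# Hierarchical clustering (SciPy analogue of R's hclust):
# build a dendrogram, then cut it into two flat clusters.
Z = linkage(data, method="ward")
labels = fcluster(Z, t=2, criterion="maxclust")

# Minimal 1-D self-organizing map: each sample pulls its
# best-matching unit (BMU) and the BMU's grid neighbours toward it,
# with a shrinking neighbourhood and decaying learning rate.
def train_som(x, n_units=4, epochs=50, lr=0.5, sigma=1.0):
    units = x[rng.choice(len(x), n_units, replace=False)].copy()
    grid = np.arange(n_units)
    for _ in range(epochs):
        for sample in x[rng.permutation(len(x))]:
            bmu = np.argmin(np.linalg.norm(units - sample, axis=1))
            h = np.exp(-((grid - bmu) ** 2) / (2 * sigma ** 2))
            units += lr * h[:, None] * (sample - units)
        lr *= 0.95      # decay learning rate
        sigma *= 0.95   # shrink neighbourhood
    return units

units = train_som(data)
```

On this toy data the hierarchical cut recovers the two blobs exactly, and the SOM units settle onto the data manifold; density-peak clustering (what densityClust implements) is omitted to keep the sketch short.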
“…Multimodal data fusion is thus a direct consequence of the well-accepted paradigm that certain natural processes and phenomena are expressed under completely different physical guises [6]. Recent works show a growing interest toward multimodal association in several application areas such as developmental robotics [3], audio-visual signal processing [7,8], spatial perception [9,10], attention-driven selection [11] and tracking [12], memory encoding [13], emotion recognition [14], human-machine interaction [15], remote sensing and earth observation [16], medical diagnosis [17], understanding brain functionality [18], and so forth. Interestingly, the last-mentioned application is our starting block: how does the brain handle multimodal learning in the natural environment?…”
Section: Introduction (mentioning; confidence: 99%)