2022
DOI: 10.1038/s41598-022-14916-1

Neural network based successor representations to form cognitive maps of space and language

Abstract: How does the mind organize thoughts? The hippocampal-entorhinal complex is thought to support domain-general representation and processing of structural knowledge of arbitrary state, feature and concept spaces. In particular, it enables the formation of cognitive maps, and navigation on these maps, thereby broadly contributing to cognition. It has been proposed that the concept of multi-scale successor representations provides an explanation of the underlying computations performed by place and grid cells. Her…
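The abstract's central object, the successor representation, can be written in closed form as M = (I − γT)⁻¹ for a state-transition matrix T and discount factor γ. The following minimal Python sketch (the toy ring-world, function name, and γ values are illustrative assumptions, not taken from the paper) shows how different discount factors yield successor maps of different spatial scale:

import numpy as np

def successor_representation(T, gamma):
    """Closed-form SR: M = sum_t (gamma * T)^t = (I - gamma * T)^(-1)."""
    n = T.shape[0]
    return np.linalg.inv(np.eye(n) - gamma * T)

# Toy 4-state ring world with a uniform random-walk transition matrix (assumed example).
T = np.array([
    [0.0, 0.5, 0.0, 0.5],
    [0.5, 0.0, 0.5, 0.0],
    [0.0, 0.5, 0.0, 0.5],
    [0.5, 0.0, 0.5, 0.0],
])

# "Multi-scale" SRs: larger discount factors integrate over longer horizons,
# producing coarser, wider-reaching maps of the same state space.
for gamma in (0.3, 0.7, 0.95):
    M = successor_representation(T, gamma)
    print(f"gamma={gamma}:\n{np.round(M, 2)}\n")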

Cited by 13 publications (6 citation statements)
References 91 publications
“…This interesting finding, previously reported in Schilling et al. [15], suggests that unsupervised dimensionality reduction could be used to automatically detect and enhance natural clustering in unlabeled data. In combination with automatic labeling methods, such as Gaussian Mixture Models [41], or the concept of the successor representations to map complex data structures [24, 42], this may provide an objective way to define ’natural kinds’ in arbitrary data sets.…”
Section: Discussion and Outlook (mentioning)
confidence: 99%
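As a rough illustration of the pipeline this statement describes (unsupervised dimensionality reduction followed by automatic labeling with a Gaussian Mixture Model), a minimal Python sketch could look like the following; the data set, sample size, and number of mixture components are assumptions chosen only for the example:

import numpy as np
from sklearn.datasets import load_digits
from sklearn.manifold import MDS
from sklearn.mixture import GaussianMixture

# Treat the digits as unlabeled data: reduce dimensionality, then let a GMM
# assign cluster labels ("natural kinds") automatically.
X, _ = load_digits(return_X_y=True)
X_low = MDS(n_components=2, random_state=0).fit_transform(X[:500])

gmm = GaussianMixture(n_components=10, random_state=0).fit(X_low)
cluster_labels = gmm.predict(X_low)
print(np.bincount(cluster_labels))  # size of each automatically found cluster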
“…By color-coding each projected data point of a data set according to its label, the representation of the data can be visualized as a set of point clusters. For instance, MDS has already been applied to visualize word class distributions of different linguistic corpora [23], hidden layer representations (embeddings) of artificial neural networks [6, 15, 24], structure and dynamics of recurrent neural networks [25-28], or brain activity patterns assessed during, e.g., pure tone or speech perception [14, 23, 29], or even during sleep [5, 30].…”
Section: Control of Class Separation by Quantity S (mentioning)
confidence: 99%
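The color-coded MDS visualization described in this statement can be reproduced in a few lines of Python; the toy data set and plotting parameters below are assumptions for illustration, not the corpora or network embeddings used in the cited work:

import matplotlib.pyplot as plt
from sklearn.datasets import load_iris
from sklearn.manifold import MDS

# Project high-dimensional feature vectors to 2D and color each point by its label,
# so class structure shows up as separated point clusters.
X, y = load_iris(return_X_y=True)
X_2d = MDS(n_components=2, random_state=0).fit_transform(X)

plt.scatter(X_2d[:, 0], X_2d[:, 1], c=y, cmap="viridis", s=20)
plt.colorbar(label="class label")
plt.title("MDS projection, color-coded by label")
plt.show()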
“…This automatically rules out the contention that semantic inquiry in the two paradigms taps into distinct properties of linguistic meaning. On the empirical front, there is a good amount of neuroscientific evidence that suggests that semantic representations are often map-like or frame-like and such representations can be implemented by neural networks in the entorhinal-hippocampal complex (Covington & Duff, 2016; Piai et al., 2016; Stoewer et al., 2022). Such map-like representations are also organized around concepts of objects and things that form the basis of semantic representations in cortical columns (see Hawkins, Ahmad, & Cui, 2017; Lewis, Purdy, Ahmad, & Hawkins, 2019).…”
Section: Remaining Ontological Issues (mentioning)
confidence: 99%
“…The more complex nature of problem identification is also borne out in neuroscientific studies. Just as multiple-choice questions are easier to answer than open-ended questions, involving fewer complex networks in the brain (Zhang et al., 2021), the retrieval of information to answer a question is less neurologically complex than formulating one (Stoewer et al., 2022). As a teaching strategy, Rothstein and Santana (2011) encourage us to Make Just One Change: Teach Students to Ask Their Own Questions.…”
Section: Identifying and Resolving Problems (mentioning)
confidence: 99%