2022
DOI: 10.1162/neco_a_01469

Understanding Memories of the Past in the Context of Different Complex Neural Network Architectures

Abstract: Deep learning (primarily using backpropagation) and neuroevolution are the preeminent methods of optimizing artificial neural networks. However, they often create black boxes that are as hard to understand as the natural brains they seek to mimic. Previous work has identified an information-theoretic tool, referred to as R, which allows us to quantify and identify mental representations in artificial cognitive systems. The use of such measures has allowed us to make previous black boxes more transparent. Here …
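For readers unfamiliar with R: in the prior work this abstract refers to, representation is typically measured as the information shared between environment states and the agent's internal (memory) states that is not already available in its sensors, i.e., a conditional mutual information I(E; M | S). The sketch below estimates such a quantity from discrete samples; the decomposition into E, M, and S and the plug-in estimator are assumptions for illustration, not the paper's exact procedure.

```python
from collections import Counter
from math import log2

def conditional_mutual_information(samples):
    """Estimate I(E; M | S) from a list of (e, m, s) tuples of
    discrete (e.g., binned) states."""
    n = len(samples)
    p_ems = Counter(samples)                       # joint counts over (e, m, s)
    p_es = Counter((e, s) for e, _, s in samples)  # marginal counts over (e, s)
    p_ms = Counter((m, s) for _, m, s in samples)  # marginal counts over (m, s)
    p_s = Counter(s for _, _, s in samples)        # marginal counts over s
    r = 0.0
    for (e, m, s), c in p_ems.items():
        # I(E;M|S) = sum_{e,m,s} p(e,m,s) log2[ p(s) p(e,m,s) / (p(e,s) p(m,s)) ]
        # (the sample size n cancels when written in raw counts)
        r += (c / n) * log2((p_s[s] * c) / (p_es[(e, s)] * p_ms[(m, s)]))
    return r

# Example: memory perfectly mirrors a binary environment while the sensor
# is uninformative, so the estimate should be 1 bit.
samples = [(e, e, 0) for e in (0, 1)] * 50
print(conditional_mutual_information(samples))  # -> 1.0
```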

Cited by 5 publications (7 citation statements); references 0 publications.
“…In our previous studies [24, 25, 47], we demonstrated the significant influence of genetic algorithms on the distribution of representations, showing that evolved neural networks exhibit less information smearing. Despite these advances, we have not identified a backpropagation method with similar effects, possibly because the concept of representational smearing is so new.…”
Section: Discussion
confidence: 85%
“…In this study, we focus on the distribution of information within the neural network. Previous research has indicated that representations, or the information a neural network possesses about its environment, are dispersed throughout deep-learned networks [23, 24, 25, 26]. In contrast, neural networks optimized using genetic algorithms do not exhibit this tendency and demonstrate greater robustness to noise [24, 27, 28].…”
Section: Introduction
confidence: 99%
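One plausible way to make "smearing" concrete: compute a representation score per neuron and ask how evenly it is spread. The normalized entropy of the per-neuron scores is one such index (0 when a single neuron carries everything, 1 when the representation is spread evenly); this index is an illustrative assumption, not a definition taken from the paper.

```python
from math import log2

def smearing_index(per_neuron_r):
    """Normalized entropy of per-neuron representation scores r_i
    (e.g., R computed separately for each hidden neuron)."""
    total = sum(per_neuron_r)
    if total == 0 or len(per_neuron_r) < 2:
        return 0.0
    probs = [r / total for r in per_neuron_r if r > 0]
    entropy = -sum(p * log2(p) for p in probs)
    return entropy / log2(len(per_neuron_r))   # scale to [0, 1]

print(smearing_index([1.2, 0.0, 0.1, 0.1]))  # concentrated -> low (~0.37)
print(smearing_index([0.4, 0.4, 0.4, 0.4]))  # evenly smeared -> 1.0
```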
“…In particular, we are coarse-graining those entropies by mapping all continuous values to the binary categories 0 and 1. We previously used the median value of a neuron’s excitation level as the threshold for the bin [45]. Instead, here the hidden-state values are clustered using a k-means clustering algorithm with k = 2.…”
Section: Methods
confidence: 99%
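A minimal sketch of this coarse-graining step, assuming the clustering is applied per neuron and that the binary categories 0 and 1 imply k = 2; scikit-learn's KMeans stands in for whatever implementation the authors used.

```python
import numpy as np
from sklearn.cluster import KMeans

def binarize_hidden_states(hidden, k=2):
    """Cluster each neuron's activations into k bins (k = 2 -> binary).
    hidden: array of shape (n_samples, n_neurons)."""
    hidden = np.asarray(hidden, dtype=float)
    labels = np.empty(hidden.shape, dtype=int)
    for j in range(hidden.shape[1]):
        km = KMeans(n_clusters=k, n_init=10).fit(hidden[:, j:j + 1])
        # Reorder cluster labels by center so that 0 = low, 1 = high
        # (k-means label assignment is otherwise arbitrary).
        order = np.argsort(km.cluster_centers_.ravel())
        remap = np.empty(k, dtype=int)
        remap[order] = np.arange(k)
        labels[:, j] = remap[km.labels_]
    return labels

def binarize_by_median(hidden):
    """The previous approach mentioned above: threshold each neuron's
    activations at its median excitation level."""
    hidden = np.asarray(hidden, dtype=float)
    return (hidden > np.median(hidden, axis=0)).astype(int)
```

Unlike a fixed median threshold, k-means places the bin boundary between the two natural modes of each neuron's activation distribution, which matters when the distribution is skewed.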
“…We consider two different tasks. The first is a memorization task: the objective is to read back a set of inputs with different delays (a task we call "n-Back" [16]). The second is an active perception task (as pioneered by Beer [17, 18]), where the agent must decide whether to catch or avoid a falling block based on its size and/or direction of motion.…”
Section: Information Fragmentation
confidence: 99%
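A minimal, hypothetical sketch of the active perception setup in the spirit of Beer's block-catching task: a block of a given size drifts laterally as it falls toward an agent that must catch or avoid it. The grid width, block sizes, sensor layout, and catch rule below are all illustrative assumptions, not the paper's parameters.

```python
import random

GRID_W = 16                  # illustrative world width
SMALL, LARGE = 2, 4          # illustrative block sizes: catch small, avoid large

def run_episode(policy, height=10, rng=random):
    """policy(sensors) -> move in {-1, 0, +1}; returns True if the
    agent made the correct catch/avoid decision."""
    size = rng.choice([SMALL, LARGE])
    drift = rng.choice([-1, +1])             # lateral direction of motion
    block_x = rng.randrange(GRID_W - size)
    agent_x = GRID_W // 2
    for _ in range(height):                  # block falls one row per step
        block_x = min(max(block_x + drift, 0), GRID_W - size)
        sensors = [int(block_x <= agent_x + o < block_x + size)
                   for o in (-1, 0, 1)]      # three crude overlap sensors
        agent_x = min(max(agent_x + policy(sensors), 0), GRID_W - 1)
    caught = block_x <= agent_x < block_x + size
    return caught == (size == SMALL)         # catch small, avoid large
```

What makes this "active" perception is that no single sensor reading reveals the block's size or drift; the agent has to integrate observations over time, which is exactly why internal memory (and hence representation) is required.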
“…The first task is n-Back [16], a simple memory problem where an agent is provided a sequence of binary digits through a single sensory node, one at a time. The inputs must be stored in memory and read out with particular delays.…”
Section: n-Back Task
confidence: 99%
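A minimal sketch of data generation for such a task: a binary stream arrives one bit at a time, and for each delay n the target output at step t is the input seen n steps earlier. The delay set {1, 2, 3} and stream length are illustrative assumptions; the paper's exact settings are not given here.

```python
import random

def make_nback_stream(length=100, delays=(1, 2, 3), seed=0):
    """Generate a binary input stream and, for each delay n, the target
    sequence the agent should output (the input from n steps earlier)."""
    rng = random.Random(seed)
    stream = [rng.randint(0, 1) for _ in range(length)]
    start = max(delays)  # targets are undefined until the longest delay elapses
    targets = {n: [stream[t - n] for t in range(start, length)]
               for n in delays}
    return stream, targets

stream, targets = make_nback_stream()
# At step t >= 3, targets[2][t - 3] is the bit the agent saw at step t - 2.
```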