2023
DOI: 10.1088/1748-0221/18/02/c02060
Emergent computations in trained artificial neural networks and real brains

Abstract: Synaptic plasticity allows cortical circuits to learn new tasks and to adapt to changing environments. How do cortical circuits use plasticity to acquire functions such as decision-making or working memory? Neurons are connected in complex ways, forming recurrent neural networks, and learning modifies the strength of their connections. Moreover, neurons communicate by emitting brief, discrete electrical signals. Here we describe how to train recurrent neural networks in tasks like those used to train animals in neur…

Cited by 4 publications (2 citation statements)
References 82 publications
“…Further evidence suggests that RCNs supersede deep learning-based models for temporal series prediction even on the verge of chaos [40]. Richer approaches aim to train the reservoir connections themselves and have been proven to be useful in understanding the dynamical properties of cortical networks [35], offering an interesting framework for similar use cases. The parameter values used in our experiments can be found in Table 1.…”
Section: Reservoir Computing
confidence: 99%
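The citation statements above refer to reservoir computing networks (RCNs), in which a fixed random recurrent network is driven by an input and only a linear readout is trained. A minimal sketch of this idea, assuming an echo state network on a toy one-step-ahead sine-prediction task (all parameter values here are illustrative, not taken from the cited works):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy task: one-step-ahead prediction of a sine wave.
T = 500
u = np.sin(0.1 * np.arange(T + 1))
inputs, targets = u[:-1], u[1:]

# Fixed random reservoir (only the readout is trained).
N = 100
W_in = rng.uniform(-0.5, 0.5, size=N)
W = rng.normal(0.0, 1.0, size=(N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # scale spectral radius below 1

# Drive the reservoir and collect its states.
x = np.zeros(N)
states = np.empty((T, N))
for t in range(T):
    x = np.tanh(W @ x + W_in * inputs[t])
    states[t] = x

# Ridge-regression readout, discarding an initial washout period.
washout, lam = 50, 1e-6
S, y = states[washout:], targets[washout:]
W_out = np.linalg.solve(S.T @ S + lam * np.eye(N), S.T @ y)

pred = S @ W_out
mse = np.mean((pred - y) ** 2)
```

Keeping the spectral radius below one keeps the reservoir dynamics stable ("echo state property"); the "richer approaches" mentioned in the statements would instead also adjust the recurrent weights `W` themselves.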
“…As simple as this paradigm seems, there is further evidence that RCNs supersede state-of-the-art deep learning-based models for temporal series prediction even on the verge of chaos [53]. Interestingly, richer approaches aim to train the reservoir connections themselves and have been proven to be useful to understand the dynamical properties of cortical networks (see [54] for an extensive review).…”
Section: Reservoir Computing Networks (RCNs)
confidence: 99%