DOI: 10.15760/etd.669
Reward-driven Training of Random Boolean Network Reservoirs for Model-Free Environments

Abstract: Reservoir Computing (RC) is an emerging machine learning paradigm where a fixed kernel, built from a randomly connected "reservoir" with sufficiently rich dynamics, is capable of expanding the problem space in a non-linear fashion to a higher dimensional feature space. These features can then be interpreted by a linear readout layer that is trained by a gradient descent method. In comparison to traditional neural networks, only the output layer needs to be trained, which leads to a significant computational ad…
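The abstract describes the standard RC division of labor: a fixed, randomly connected reservoir projects the input into a high-dimensional state space, and only a linear readout on those states is trained. The following is a minimal illustrative sketch of that idea using a continuous echo-state-style reservoir and a ridge-regression readout; it is not the thesis's random Boolean network reservoir or its reward-driven training, and all names and sizes here are hypothetical choices for the example.

```python
import numpy as np

# Illustrative echo-state-style reservoir (assumption: continuous tanh units
# and a least-squares readout stand in for the thesis's Boolean reservoir).

rng = np.random.default_rng(0)

n_in, n_res = 1, 100                                  # sizes chosen arbitrarily
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))          # fixed input weights
W = rng.uniform(-0.5, 0.5, (n_res, n_res))            # fixed recurrent weights
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))       # keep spectral radius < 1

def run_reservoir(inputs):
    """Drive the fixed reservoir and collect its high-dimensional states."""
    x = np.zeros(n_res)
    states = []
    for u in inputs:
        x = np.tanh(W_in @ np.atleast_1d(u) + W @ x)
        states.append(x.copy())
    return np.array(states)

# Toy task: reproduce a delayed copy of a random input signal.
u = rng.uniform(-1, 1, 500)
target = np.roll(u, 3)

X = run_reservoir(u)

# Only the linear readout is trained; ridge regression is used here for
# brevity in place of the gradient descent mentioned in the abstract.
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ target)

pred = X @ W_out
print("train MSE:", np.mean((pred - target) ** 2))
```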

Cited by 1 publication (1 citation statement). References 49 publications.
“…Christensen and Oppacher presented a set of trees that efficiently search the solution space of Koza's GPs requiring less computational power [26]. The Santa Fe trail has also been used as a basis to prove implementation in reservoir computing [27].…”
Section: Jefferson's Work Tested the Decision Making Capability of the…
Confidence: 99%