2010
DOI: 10.1007/978-3-642-15819-3_47
Echo State Networks with Sparse Output Connections

Cited by 12 publications (12 citation statements, published 2014–2023); references 8 publications.
“…They investigate different greedy methods to this end, including backward selection (where connections are removed one at a time based on an iterative procedure), random deletion, and others. Similar experiments are conducted by Kobialka and Kayani [40].…”
Section: B. Sparse Readouts for ESNs (supporting)
confidence: 52%
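To make the backward-selection idea concrete, the sketch below greedily removes one readout connection at a time, always dropping the connection whose removal increases a held-out error the least. The toy reservoir states, helper names, and stopping size are illustrative assumptions, not the exact procedure of the cited paper.

```python
# Hypothetical sketch of backward selection over ESN readout connections:
# start from a readout connected to all reservoir units, then repeatedly
# drop the single connection whose removal hurts validation error the least.
import numpy as np

rng = np.random.default_rng(0)

def ridge_readout(states, targets, lam=1e-6):
    """Fit linear readout weights with ridge regression."""
    n = states.shape[1]
    return np.linalg.solve(states.T @ states + lam * np.eye(n), states.T @ targets)

def val_error(keep, S_tr, y_tr, S_val, y_val):
    """Validation MSE when only the reservoir units in `keep` feed the readout."""
    w = ridge_readout(S_tr[:, keep], y_tr)
    pred = S_val[:, keep] @ w
    return float(np.mean((pred - y_val) ** 2))

def backward_select(S_tr, y_tr, S_val, y_val, n_keep):
    """Greedy backward elimination of readout connections."""
    keep = list(range(S_tr.shape[1]))
    while len(keep) > n_keep:
        # try removing each remaining connection; keep the cheapest removal
        errs = [val_error([u for u in keep if u != c], S_tr, y_tr, S_val, y_val)
                for c in keep]
        keep.pop(int(np.argmin(errs)))
    return keep

# toy data: random "reservoir states" and a target depending on a few units
S = rng.standard_normal((400, 50))
y = S[:, :5] @ rng.standard_normal((5, 1)) + 0.01 * rng.standard_normal((400, 1))
S_tr, y_tr, S_val, y_val = S[:300], y[:300], S[300:], y[300:]

selected = backward_select(S_tr, y_tr, S_val, y_val, n_keep=10)
print("kept reservoir units:", sorted(selected))
```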
“…The usual benefits of using sparse regression algorithms are efficient suppression of noise, easier interpretation of the results, improved generalisation properties, a reduced surplus of inputs (in this case readouts), and others [7].…”
Section: Sparse Linear Regression (mentioning)
confidence: 99%
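As a minimal illustration of such a sparse regression readout, the sketch below fits an L1-penalised (Lasso) readout on synthetic reservoir states, which drives most output weights to exactly zero. The data and the regularisation strength are assumptions for illustration only, not values from the cited work.

```python
# Hypothetical sketch: an L1-penalised (Lasso) readout yields a sparse set
# of output connections, since many learned weights become exactly zero.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(1)
S = rng.standard_normal((500, 100))            # collected reservoir states
y = S[:, :8] @ rng.standard_normal(8) + 0.05 * rng.standard_normal(500)

readout = Lasso(alpha=0.05).fit(S, y)          # alpha is an illustrative value
n_active = int(np.sum(readout.coef_ != 0))
print(f"active output connections: {n_active} of {S.shape[1]}")
```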
“…In [7] it is shown that using feature selection algorithms (backward elimination and the Markov blanket-embedded genetic algorithm, MBEGA) to reduce the number of required readouts can lower the generalisation error of the ESN, i.e. it is desirable to reduce the overall number of readouts.…”
Section: Introduction (mentioning)
confidence: 99%
“…Not all dimensions may contribute to the solution. The internal layer of an ESN is sparsely connected, so the fact that each output node is connected to all internal nodes seems contradictory [20]. Therefore, the output connections of an ESN should be optimized.…”
Section: Introduction (mentioning)
confidence: 98%
“…[28]. Kobialka and Kayani use a greedy feature selection algorithm to exclude irrelevant internal ESN states [20]. Optimizing the connection structure of the output weights amounts to deciding whether each internal node is connected to the output layer, so it is a discrete optimization problem.…”
Section: Introduction (mentioning)
confidence: 99%
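To make the discrete formulation concrete, the sketch below treats the output connection structure as a 0/1 mask over the N reservoir units (a search space of 2^N candidate masks) and scores a few randomly drawn masks, i.e. the random-deletion baseline mentioned above, against the fully connected readout. The data and names are illustrative assumptions only.

```python
# Hypothetical sketch of the discrete view: each output connection is a 0/1
# decision, so the search space over N reservoir units has 2**N candidate
# masks. A few random masks ("random deletion") are scored against the
# fully connected readout on synthetic data.
import numpy as np

rng = np.random.default_rng(2)
S = rng.standard_normal((300, 40))                       # reservoir states
y = S[:, :4] @ rng.standard_normal(4) + 0.02 * rng.standard_normal(300)
S_tr, y_tr, S_val, y_val = S[:200], y[:200], S[200:], y[200:]

def mse_for_mask(mask):
    """Least-squares readout restricted to the units where mask is True."""
    w, *_ = np.linalg.lstsq(S_tr[:, mask], y_tr, rcond=None)
    return float(np.mean((S_val[:, mask] @ w - y_val) ** 2))

full = np.ones(S.shape[1], dtype=bool)
print("full readout MSE:", mse_for_mask(full))
for _ in range(3):
    mask = rng.random(S.shape[1]) < 0.5                  # keep roughly half the units
    print(f"random mask ({mask.sum()} connections) MSE:", mse_for_mask(mask))
```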