2017
DOI: 10.1007/s12559-017-9466-4

Orthogonal Echo State Networks and Stochastic Evaluations of Likelihoods

Abstract: We report on probabilistic likelihood estimates performed on time series using an echo state network with orthogonal recurrent connectivity. Results from tests using synthetic stochastic input time series with temporal inference indicate that the network's capability to infer depends on the balance between input strength and recurrent activity. This balance influences the quality of inference from the short-term input history versus inference that accoun…
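The balance the abstract describes can be made concrete with a minimal ESN state update. The following sketch is illustrative only: the orthogonal recurrent matrix is built here via a QR decomposition, and the gain values, network size, and input are assumptions, not the paper's actual configuration.

```python
import numpy as np

rng = np.random.default_rng(0)
n_res, n_in = 100, 1

# Orthogonal recurrent matrix from the QR decomposition of a Gaussian
# matrix (one common construction; the paper's may differ).
W, _ = np.linalg.qr(rng.normal(size=(n_res, n_res)))

W_in = rng.uniform(-1, 1, size=(n_res, n_in))

input_gain = 0.5      # strength of the driving input
recurrent_gain = 0.9  # strength of recurrent activity; the ratio of the
                      # two gains sets the balance the abstract refers to

def step(x, u):
    # Standard tanh ESN state update.
    return np.tanh(recurrent_gain * (W @ x) + input_gain * (W_in @ u))

x = np.zeros(n_res)
for u in rng.uniform(-1, 1, size=(200, n_in)):
    x = step(x, u)
```

Raising `input_gain` relative to `recurrent_gain` biases the state toward the most recent inputs; lowering it lets recurrent activity retain a longer input history.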

Cited by 4 publications (3 citation statements)
References 21 publications
“…Interesting applications of arithmetic encoders can be found in, e.g., [30], [31], [32], [33]. Recent investigations have revealed an analogy between arithmetic encoders and optimal ESNs [6], [34], [9], [35], [5], [36]. Making use of this idea, the connectivity in an RNN would be arranged such that for the given input statistics in U, the network performs a data compression so that as many as possible past input values are represented in the reservoir.…”
Section: RNNs and Their Fractal Dimension
confidence: 99%
“…One of the key problems for Recurrent Neural Network (RNN) initialization, and in particular for Reservoir Computing (RC) methods like Echo State Networks (ESNs) [1], [2], [3], is that it is still unclear what kind of connectivity results in the best performance. For recurrent neural networks, some heuristics have shown much better performance than others: depending on the task, ESNs with orthogonal recurrent connectivity matrices that scale with the size of the network have shown better performance than other strategies [4], [5], [6]. An initialization method that can create a good recurrent weight matrix for a given task would be a compromise between an arbitrarily chosen connectivity matrix, and a connectivity matrix found by more expensive optimization, e.g., by backpropagation through time.…”
Section: Introduction
confidence: 99%
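The heuristic this statement cites contrasts orthogonal recurrent matrices with the usual rescaled random reservoirs. A short sketch of both constructions, under the assumption of dense Gaussian initial weights and a QR-based orthogonalization (illustrative choices, not the cited papers' exact setup):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200

# Common heuristic: dense Gaussian reservoir rescaled to a target
# spectral radius just below 1.
W_rand = rng.normal(size=(n, n))
W_rand *= 0.95 / max(abs(np.linalg.eigvals(W_rand)))

# Alternative heuristic from the cited line: an orthogonal recurrent
# matrix, here obtained from the QR decomposition of a Gaussian matrix.
W_orth, _ = np.linalg.qr(rng.normal(size=(n, n)))

# Orthogonal matrices preserve the 2-norm of the state vector exactly,
# whereas the rescaled random matrix contracts it unevenly.
x = rng.normal(size=n)
print(np.linalg.norm(W_orth @ x) / np.linalg.norm(x))  # ≈ 1.0
```

The norm preservation is what makes the orthogonal choice attractive as an initialization: no direction of the state space is systematically amplified or damped before training.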
“…entries drawn from uniform or Gaussian distributions [31,32]. However, in the literature it is possible to find reservoirs with different connection patterns, including deterministic topologies [50] and those exploiting the norm-preserving property of orthogonal matrices [36]. In our case the read-out matrix W_o ∈ R^(N_o × N_r) is optimised for the task at hand.…”
Section: Echo State Network
confidence: 99%
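The last statement notes that only the read-out matrix W_o is optimised. A minimal sketch of that training step, assuming an orthogonal reservoir, ridge regression for the read-out, and a toy one-step-memory task (all illustrative assumptions, not the cited work's setup):

```python
import numpy as np

rng = np.random.default_rng(2)
n_res, T = 100, 500

# Orthogonal (norm-preserving) reservoir and random input weights.
W, _ = np.linalg.qr(rng.normal(size=(n_res, n_res)))
w_in = rng.uniform(-1, 1, size=n_res)

u = rng.uniform(-1, 1, size=T)
y_target = np.roll(u, 1)  # toy task: recall the previous input

# Harvest reservoir states; the reservoir itself is never trained.
X = np.zeros((T, n_res))
x = np.zeros(n_res)
for t in range(T):
    x = np.tanh(0.9 * (W @ x) + 0.5 * w_in * u[t])
    X[t] = x

# Ridge regression for the read-out W_o -- the only optimised part.
lam = 1e-6
W_o = np.linalg.solve(X.T @ X + lam * np.eye(n_res), X.T @ y_target)
pred = X @ W_o
```

Because training is a single linear solve over harvested states, swapping in a different reservoir connectivity (random, deterministic, or orthogonal) leaves the read-out optimisation unchanged.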