2016
DOI: 10.1016/j.neucom.2016.01.086
Model-coupled autoencoder for time series visualisation

Abstract: We present an approach for the visualisation of a set of time series that combines an echo state network with an autoencoder. For each time series in the dataset we train an echo state network, using a common and fixed reservoir of hidden neurons, and use the optimised readout weights as the new representation. Dimensionality reduction is then performed via an autoencoder on the readout weight representations. The crux of the work is to equip the autoencoder with a loss function that correctly interprets the r…
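The pipeline described in the abstract — a shared, fixed reservoir whose per-series readout weights become the representation — can be sketched as follows. The reservoir size of 50 and the cycle architecture follow the configuration quoted in the citation statements; the cycle weight, input-weight scale, and ridge penalty are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def esn_readout_representation(series, n_reservoir=50, cycle_w=0.9,
                               ridge=1e-2, seed=0):
    """Fixed-cycle-reservoir ESN: each univariate series is summarised by
    its ridge-regression readout weights (one-step-ahead prediction)."""
    rng = np.random.default_rng(seed)
    w_in = rng.uniform(-0.1, 0.1, size=n_reservoir)  # shared input weights
    reps = []
    for x in series:
        h = np.zeros(n_reservoir)
        states, targets = [], []
        for t in range(len(x) - 1):
            # Cycle architecture: unit i is driven only by unit i-1,
            # all with the same weight cycle_w.
            h = np.tanh(cycle_w * np.roll(h, 1) + w_in * x[t])
            states.append(h.copy())
            targets.append(x[t + 1])
        H = np.asarray(states)
        y = np.asarray(targets)
        # Ridge readout: this weight vector is the series' new
        # fixed-length representation.
        w = np.linalg.solve(H.T @ H + ridge * np.eye(n_reservoir), H.T @ y)
        reps.append(w)
    return np.asarray(reps)

# Two toy series with different dynamics map to different readout vectors.
t = np.linspace(0, 8 * np.pi, 200)
R = esn_readout_representation([np.sin(t), np.sin(2 * t)])
print(R.shape)  # (2, 50): one fixed-length vector per series
```

The dimensionality-reduction step (the autoencoder over these vectors) then operates on `R` rather than on the raw series.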

Cited by 15 publications (6 citation statements)
References 12 publications
“…We mention in passing that the parameter configuration used for the proposed method was an ESN with a hidden reservoir of 50 neurons using the cycle architecture (Rodan & Tiňo 2012) coupled to an autoencoder with a hidden layer of 10 neurons. More information on the parameters can be found in Gianniotis et al (2015). For the plain autoencoder, we also used a hidden layer of 10 neurons.…”
Section: Results
confidence: 99%
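The "plain autoencoder" baseline mentioned in this excerpt (readout vectors compressed through a 10-unit hidden layer) can be sketched as a minimal linear autoencoder trained on reconstruction error. This is only the plain baseline: the paper's model-coupled loss is not reproduced here, and the learning rate, epoch count, and weight initialisation are arbitrary choices.

```python
import numpy as np

def train_plain_autoencoder(X, n_hidden=10, lr=1e-2, epochs=5000, seed=0):
    """Linear autoencoder trained by gradient descent on the squared
    reconstruction error (the 'plain' baseline from the excerpt)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W1 = rng.normal(0.0, 0.1, (d, n_hidden))  # encoder: d -> n_hidden
    W2 = rng.normal(0.0, 0.1, (n_hidden, d))  # decoder: n_hidden -> d
    for _ in range(epochs):
        Z = X @ W1                 # hidden codes (used for visualisation)
        Xh = Z @ W2                # reconstruction
        G = 2.0 * (Xh - X) / n     # grad of (1/n) * sum of squared errors
        W1 -= lr * (X.T @ (G @ W2.T))
        W2 -= lr * (Z.T @ G)
    return W1, W2

# Compress 50-d stand-ins for ESN readout vectors down to 10 dimensions.
X = np.random.default_rng(1).normal(size=(20, 50))
W1, W2 = train_plain_autoencoder(X)
Z = X @ W1                           # 10-d codes per series
err = np.mean((Z @ W2 - X) ** 2)     # reconstruction error after training
print(Z.shape)  # (20, 10)
```

A nonlinear hidden layer and the model-coupled loss of the paper would replace the plain squared error above; the 10-unit bottleneck is the layer whose activations are visualised.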
“…A more detailed account of the adopted methodology in this work has previously appeared in Gianniotis et al (2015). The terms time-series and sequence are used interchangeably.…”
Section: Methods
confidence: 99%
“…It has attracted a renewed interest lately with deep network approaches generally utilizing restricted Boltzmann machines [35]. There has been recent work done on time series visualization utilizing autoencoder structures [36]. Time series forecasting with neural networks is reported to be advantageous even with relatively small data cases [37].…”
Section: Related Work
confidence: 99%