2021
DOI: 10.1063/5.0055371

Initializing LSTM internal states via manifold learning

Abstract: We present an approach, based on learning an intrinsic data manifold, for the initialization of the internal state values of long short-term memory (LSTM) recurrent neural networks, ensuring consistency with the initial observed input data. Exploiting the generalized synchronization concept, we argue that the converged, “mature” internal states constitute a function on this learned manifold. The dimension of this manifold then dictates the length of observed input time series data required for consistent initi…
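The idea in the abstract can be sketched in a toy numerical experiment. The following is a minimal illustration, not the authors' implementation: a fixed random numpy LSTM cell stands in for a trained network, sine waves parameterised by phase play the role of data on a low-dimensional manifold, PCA of short observed windows stands in for the manifold-learning step, and a nearest-neighbour lookup maps manifold coordinates to "mature" internal states. All names and sizes here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Minimal LSTM cell with fixed random weights (stand-in for a trained network) ---
H = 8  # hidden size; scalar input
W = rng.normal(scale=0.5, size=(4 * H, H + 1))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c):
    z = W @ np.concatenate([[x], h])
    i, f = sigmoid(z[:H]), sigmoid(z[H:2 * H])
    g, o = np.tanh(z[2 * H:3 * H]), sigmoid(z[3 * H:])
    c = f * c + i * g
    h = o * np.tanh(c)
    return h, c

def run(xs, h=None, c=None):
    h = np.zeros(H) if h is None else h
    c = np.zeros(H) if c is None else c
    for x in xs:
        h, c = lstm_step(x, h, c)
    return h, c

# --- Toy data on a 1-D manifold: sine waves parameterised by phase ---
def series(phase, n):
    return np.sin(0.3 * np.arange(n) + phase)

WARMUP, WIN = 200, 10
phases = rng.uniform(0, 2 * np.pi, size=200)

# "Mature" states: run the LSTM from zero over a long warm-up ending at the window start
windows, mature = [], []
for p in phases:
    xs = series(p, WARMUP + WIN)
    h, c = run(xs[:WARMUP])            # converged internal state at the window start
    windows.append(xs[WARMUP:])        # the short observed input window
    mature.append(np.concatenate([h, c]))
windows, mature = np.array(windows), np.array(mature)

# --- Manifold-learning stand-in: PCA of the observed windows ---
mu = windows.mean(axis=0)
U = np.linalg.svd(windows - mu, full_matrices=False)[2][:2]  # 2 coords for a phase circle
coords = (windows - mu) @ U.T

def init_state(window):
    """Map a short observed window to an initial (h, c) via nearest neighbour on the manifold."""
    q = (window - mu) @ U.T
    j = np.argmin(np.linalg.norm(coords - q, axis=1))
    s = mature[j]
    return s[:H], s[H:]

# --- Compare learned initialization with a naive zero initialization on held-out phases ---
err_learned, err_zero = [], []
for p in rng.uniform(0, 2 * np.pi, size=50):
    xs = series(p, WARMUP + WIN)
    h_true, c_true = run(xs[:WARMUP])               # ground-truth mature state
    h0, c0 = init_state(xs[WARMUP:])                # state inferred from the window alone
    err_learned.append(np.linalg.norm(np.concatenate([h0 - h_true, c0 - c_true])))
    err_zero.append(np.linalg.norm(np.concatenate([h_true, c_true])))
print(f"learned-init error {np.mean(err_learned):.3f} vs zero-init {np.mean(err_zero):.3f}")
```

Because the mature state here is (approximately) a function of the phase that parameterises the data manifold, the window-based lookup recovers a far better initial state than starting from zeros, which mirrors the generalized-synchronization argument in the abstract.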

Cited by 4 publications
References 40 publications