2024
DOI: 10.1109/tnnls.2022.3201621
Slow Down to Go Better: A Survey on Slow Feature Analysis

Cited by 72 publications (8 citation statements)
References 117 publications
“…Here, we confirm their results numerically by using the potential depicted in Fig. 2c, but we also propose that the modified TAE can be extended to slow feature analysis (SFA) (Wiskott and Sejnowski, 2002; Berkes and Wiskott, 2005; Song and Zhao, 2022) (see Fig. 1b), which can render the latent variables orthogonal in multidimensional cases, since Eq.…”
Section: Results (supporting, confidence: 79%)
“…The expert prior knowledge comes from the expert diagnostic experience, which can be generalized to fault diagnosis tasks under various working conditions [18]. Research shows that the expert prior knowledge enables the model to learn more generalized fault features and improve the diagnostic accuracy of the model [18][19][20][21][22][23]. However, the expert prior knowledge is often limited and easily disturbed by noise.…”
Section: Introduction (mentioning, confidence: 99%)
“…A well-known implementation of this prior is Slow Feature Analysis (SFA), an unsupervised learning algorithm that reduces the dimensionality of its input by identifying slowly changing dimensions in the data [31, 39, 40]. SFA first isolates independent components in the data and then extracts those components that change slowly, under the premise that slower features are more meaningful representations of the data [31].…”
Section: Introduction (mentioning, confidence: 99%)
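The statement above describes the two-step structure of SFA: whiten the input to decorrelate it, then extract the directions whose values change most slowly over time. The following is a minimal linear-SFA sketch of that idea, not the implementation from the survey or from any citing paper; the function name `linear_sfa` and the two-stage eigendecomposition layout are illustrative choices.

```python
import numpy as np

def linear_sfa(X, n_features=2):
    """Minimal linear Slow Feature Analysis sketch.

    X: (T, d) time series. Returns the n_features slowest
    unit-variance, mutually decorrelated linear features.
    """
    # Step 1: center and whiten the input so all directions
    # have unit variance and are decorrelated.
    Xc = X - X.mean(axis=0)
    cov = Xc.T @ Xc / len(Xc)
    eigval, eigvec = np.linalg.eigh(cov)
    W_white = eigvec / np.sqrt(eigval)   # whitening matrix
    Z = Xc @ W_white                     # whitened signal

    # Step 2: among whitened directions, find those minimizing
    # the mean squared time derivative (slowness objective).
    dZ = np.diff(Z, axis=0)
    cov_dot = dZ.T @ dZ / len(dZ)
    # eigh returns eigenvalues in ascending order: slowest first.
    _, slow_vec = np.linalg.eigh(cov_dot)
    return Z @ slow_vec[:, :n_features]
```

On a linear mixture of a slow and a fast sinusoid, the first returned feature aligns (up to sign) with the slow source, which is the sense in which SFA treats "slower features as more meaningful representations" of the data.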