2024
DOI: 10.1088/1361-6501/ad9f89

A lightweight and rapidly converging transformer based on separable linear self-attention for fault diagnosis

Kexin Yin, Chunjun Chen, Qi Shen, et al.

Abstract: Intelligent fault diagnosis techniques for rotating machinery facilitate reliable decisions on equipment maintenance. Recently, the Transformer model has demonstrated exceptional capabilities in global feature modeling for fault diagnosis tasks, garnering significant attention from the academic community. However, it lacks sufficient prior knowledge regarding rotation invariance, scale, and shift, necessitating pre-training on extensive datasets. In comparison, contempor…
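
For context on the attention mechanism named in the title, the following is a minimal PyTorch sketch of a separable self-attention block with linear complexity in the number of tokens: per-token context scores from a one-dimensional projection replace the usual N x N attention map. The layer sizes, the placement of the ReLU, and the example input shape are illustrative assumptions, not the authors' exact formulation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SeparableLinearSelfAttention(nn.Module):
    """Sketch of linear-complexity (separable) self-attention; illustrative only."""
    def __init__(self, dim: int):
        super().__init__()
        self.to_score = nn.Linear(dim, 1)   # latent-query projection: one score per token
        self.to_k = nn.Linear(dim, dim)     # key projection
        self.to_v = nn.Linear(dim, dim)     # value projection
        self.proj = nn.Linear(dim, dim)     # output projection

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, tokens, dim)
        scores = F.softmax(self.to_score(x), dim=1)      # (B, N, 1), normalized over tokens
        k, v = self.to_k(x), self.to_v(x)                # (B, N, D) each
        context = (scores * k).sum(dim=1, keepdim=True)  # (B, 1, D) global context, O(N*D) cost
        out = F.relu(v) * context                        # broadcast context back to every token
        return self.proj(out)                            # (B, N, D)

# Illustrative use: a batch of 8 signal segments, 256 tokens of 64 features each (assumed shapes).
x = torch.randn(8, 256, 64)
block = SeparableLinearSelfAttention(dim=64)
print(block(x).shape)  # torch.Size([8, 256, 64])
```

Because the context vector is a weighted sum over tokens rather than a pairwise similarity matrix, memory and compute grow linearly with sequence length, which is the usual motivation for this family of lightweight attention blocks.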

Cited by 0 publications
References 27 publications