2021
DOI: 10.3390/e23030266
Fault Detection Based on Multi-Dimensional KDE and Jensen–Shannon Divergence

Abstract: Weak fault signals, highly coupled data, and unknown faults commonly exist in fault diagnosis systems, causing low detection and identification performance of fault diagnosis methods based on T² statistics or cross entropy. This paper proposes a new fault diagnosis method based on optimal bandwidth kernel density estimation (KDE) and Jensen–Shannon (JS) divergence distribution for improved fault detection performance. KDE addresses weak signal and coupling fault detection, and JS divergence addresses unknown fa…
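As a rough illustration of the detection idea described in the abstract, the sketch below estimates the density of a test window with a KDE and scores it against a fault-free reference density using the JS divergence. It is not the authors' algorithm: the bandwidth follows SciPy's default Scott's rule rather than the paper's optimal-bandwidth selection, and the signals, threshold, and function name js_fault_score are hypothetical.

import numpy as np
from scipy.stats import gaussian_kde
from scipy.spatial.distance import jensenshannon

def js_fault_score(reference, test, n_grid=200):
    # Estimate 1-D densities of the fault-free reference window and the test
    # window; SciPy's gaussian_kde picks the bandwidth by Scott's rule, not the
    # paper's optimal-bandwidth criterion.
    kde_ref = gaussian_kde(reference)
    kde_test = gaussian_kde(test)
    # Discretise both densities on a common grid so they can be compared as
    # probability vectors.
    grid = np.linspace(min(reference.min(), test.min()),
                       max(reference.max(), test.max()), n_grid)
    p = kde_ref(grid)
    q = kde_test(grid)
    p /= p.sum()
    q /= q.sum()
    # SciPy returns the JS distance (square root of the divergence).
    return jensenshannon(p, q) ** 2

# Hypothetical usage: a weak fault shifts the mean and variance slightly, so the
# score rises above what fault-free windows produce.
rng = np.random.default_rng(0)
normal_window = rng.normal(0.0, 1.0, 1000)
faulty_window = rng.normal(0.4, 1.3, 1000)
print(js_fault_score(normal_window, faulty_window))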

Cited by 8 publications (2 citation statements)
References: 22 publications
“…After computing cross-correlation coefficients, the level of similarity between observed and simulated catalogs is further investigated using Kullback–Leibler divergence and Jensen–Shannon divergence tests along with two-sample Kolmogorov–Smirnov statistics [41]. The Kullback–Leibler (KL) divergence (relative entropy) and the Jensen–Shannon divergence both provide a quantitative measure of proximity between two probability distributions and are used in a wide range of applications [46][47][48][49]. The tests can be used jointly to provide a more complete view of the similarity between compared distributions [50], although studies suggest a higher efficiency and flexibility of the Jensen–Shannon test [51,52].…”
Section: Data Availability Statement
mentioning, confidence: 99%
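For concreteness, here is a small sketch of the three similarity measures named in this statement, applied to two synthetic samples standing in for the observed and simulated catalogs (which are not available here); the distributions and bin count are illustrative assumptions only.

import numpy as np
from scipy.stats import entropy, ks_2samp
from scipy.spatial.distance import jensenshannon

rng = np.random.default_rng(1)
observed = rng.lognormal(mean=1.0, sigma=0.5, size=5000)    # stand-in for catalog A
simulated = rng.lognormal(mean=1.1, sigma=0.55, size=5000)  # stand-in for catalog B

# Histogram both samples on a shared grid so the divergences compare discrete
# probability distributions; a tiny offset keeps the KL divergence finite.
bins = np.histogram_bin_edges(np.concatenate([observed, simulated]), bins=50)
p = np.histogram(observed, bins=bins)[0].astype(float) + 1e-12
q = np.histogram(simulated, bins=bins)[0].astype(float) + 1e-12
p /= p.sum()
q /= q.sum()

kl = entropy(p, q)                  # Kullback-Leibler divergence D(p || q)
js = jensenshannon(p, q) ** 2       # Jensen-Shannon divergence (squared distance)
ks = ks_2samp(observed, simulated)  # two-sample Kolmogorov-Smirnov test on raw samples

print(f"KL = {kl:.4f}, JS = {js:.4f}, KS = {ks.statistic:.4f} (p = {ks.pvalue:.3g})")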
“…Targeting the challenges of weak fault signals, coupling among the different dimensions of the collected signal, and the scarcity of fault datasets, Wei et al. proposed a novel fault detection method based on multi-dimensional KDE and Jensen–Shannon divergence [5]. To address the conventional KDE method's information loss on multidimensional problems, that work extends KDE to a multidimensional version that tackles the weak fault signal and coupling problems of the collected signal.…”
Section: Introduction
mentioning, confidence: 99%
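A minimal sketch, assuming SciPy's multivariate gaussian_kde as a stand-in for the paper's multi-dimensional KDE, of the coupling point made above: the joint density of two correlated channels differs from the product of their per-channel (marginal) KDEs, and that difference is exactly the information a dimension-by-dimension KDE loses. The synthetic covariance and evaluation point are illustrative assumptions.

import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(2)
cov = np.array([[1.0, 0.8],
                [0.8, 1.0]])                  # strong coupling between the two channels
data = rng.multivariate_normal([0.0, 0.0], cov, size=2000).T   # shape (2, n_samples)

joint_kde = gaussian_kde(data)                # multi-dimensional KDE over both channels
kde_x = gaussian_kde(data[0])                 # separate one-dimensional KDEs
kde_y = gaussian_kde(data[1])

point = np.array([[1.0], [1.0]])              # evaluate along the coupled direction
print("joint KDE:            ", joint_kde(point)[0])
print("product of marginals: ", kde_x(point[0])[0] * kde_y(point[1])[0])
# The joint estimate is noticeably higher than the independence product here;
# that gap is the cross-channel information a per-dimension KDE cannot represent.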