2016
DOI: 10.1016/j.sigpro.2015.08.008

Incipient fault amplitude estimation using KL divergence with a probabilistic approach

Abstract: The Kullback–Leibler (KL) divergence is central to information theory and change detection. It is characterized by a high sensitivity to incipient faults, which cause unpredictable small changes in the process measurements. This work develops an analytical model based on the KL divergence to estimate the incipient fault magnitude in multivariate processes. In practice, the divergence has no closed form and must be numerically approximated. In the particular case of incipient fault,…
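The paper's analytical model is not reproduced in this report, but the underlying principle can be shown with a minimal sketch: under a purely illustrative Gaussian assumption, the KL divergence between the fault-free and faulty distributions has a closed form, and for a small additive bias `a` on one variable with unchanged covariance it reduces to a²/2, so a measured divergence can be inverted to recover the amplitude. The fault model and all names below are assumptions, not the paper's exact derivation.

```python
import numpy as np

def kl_gaussian(mu0, cov0, mu1, cov1):
    """Closed-form KL( N(mu0, cov0) || N(mu1, cov1) )."""
    d = mu0.size
    cov1_inv = np.linalg.inv(cov1)
    diff = mu1 - mu0
    _, logdet0 = np.linalg.slogdet(cov0)
    _, logdet1 = np.linalg.slogdet(cov1)
    return 0.5 * (np.trace(cov1_inv @ cov0) + diff @ cov1_inv @ diff
                  - d + logdet1 - logdet0)

# Illustrative fault model (an assumption, not the paper's): an incipient
# fault adds a small bias `a` to the first of d process variables.
d = 3
mu_ref, cov_ref = np.zeros(d), np.eye(d)
for a in (0.05, 0.1, 0.2):
    mu_fault = mu_ref.copy()
    mu_fault[0] += a
    kld = kl_gaussian(mu_ref, cov_ref, mu_fault, cov_ref)
    a_hat = np.sqrt(2.0 * kld)   # invert KLD = a**2 / 2 to estimate amplitude
    print(f"a={a:.2f}  KLD={kld:.5f}  estimated a={a_hat:.2f}")
```

With equal covariances the divergence grows quadratically with the bias, which is why incipient faults produce very small divergence values and why an explicit amplitude–divergence relation is useful for estimation.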

Cited by 36 publications (18 citation statements)
References 38 publications
“…Numerous studies [5,16,17] have mentioned that an incipient fault should be characterized by the following features:…”
Section: A Three-phase
confidence: 99%
“…The developed SFI in (13) has three advantages over other related works. Firstly, it is more effective in detecting the change in mean deviation, while the methods in [16,17] are not, because they used the assumption that ( −̃) is 0. Secondly, it can determine the sign of the faulty parameter, which is useful in the subsequent fault isolation.…”
Section: Remark
confidence: 99%
“…Harmouche et al. used a PCA prefilter before executing the density estimation algorithm; however, the data representation in the principal-component subspace is still suboptimal unless normality is satisfied. Conventional techniques for reducing dimensionality are based on linear data transformations, which enforce normality, resulting in a significant loss of information.…”
Section: Introduction
confidence: 99%
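For reference, a minimal sketch of the linear PCA-prefilter plus density-estimation pipeline being criticized in that excerpt might look as follows; the component count, the synthetic data, and the choice of a Gaussian KDE are illustrative assumptions, not any cited paper's exact method.

```python
import numpy as np
from scipy.stats import gaussian_kde

def pca_prefilter(X, n_components):
    """Project centred data onto its leading principal components.

    X has shape (n_samples, n_features); returns the scores and basis W.
    """
    Xc = X - X.mean(axis=0)
    # Right singular vectors of the centred data are the principal axes.
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    W = Vt[:n_components].T                 # (n_features, n_components)
    return Xc @ W, W

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 10))              # stand-in for process data
scores, W = pca_prefilter(X, n_components=2)
kde = gaussian_kde(scores.T)                # density estimate in PC space
print(kde(scores[:5].T))                    # densities at the first samples
```

The quoted objection is precisely that this projection is linear: when the original data are not Gaussian, the density seen in the principal-component subspace can misrepresent the process.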
“…Although the density ratio was proven to be an optimal statistical test in inferential statistics, Kullback and Leibler introduced the Kullback‐Leibler divergence (KLD), the statistical mean of the density ratio with respect to one of the densities, which has been shown to be a good alternative to other statistics in change detection; its detection performance when applied to different types of faults has been demonstrated in many applications. The algorithm issues an alarm every time a change of the probability density function (PDF) from its reference version has occurred, which means that a fault can be indirectly detected through the PDF divergence.…”
Section: Introduction
confidence: 99%
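A minimal, hedged sketch of that alarm logic is given below; the window size, bin count, and threshold are arbitrary assumptions, and a simple histogram estimator stands in for whatever density approximation a particular method uses.

```python
import numpy as np

def kl_histogram(x_ref, x_win, bins=30, eps=1e-12):
    """Histogram approximation of KL(p_win || p_ref) for 1-D samples."""
    lo = min(x_ref.min(), x_win.min())
    hi = max(x_ref.max(), x_win.max())
    c_ref, edges = np.histogram(x_ref, bins=bins, range=(lo, hi))
    c_win, _ = np.histogram(x_win, bins=edges)
    p_ref = (c_ref + eps) / (c_ref + eps).sum()   # smoothed reference PDF
    p_win = (c_win + eps) / (c_win + eps).sum()   # smoothed window PDF
    return float(np.sum(p_win * np.log(p_win / p_ref)))

rng = np.random.default_rng(1)
reference = rng.normal(0.0, 1.0, size=5000)       # fault-free training data
threshold = 0.05                                  # hypothetical alarm level
for bias in (0.0, 0.1, 0.3):
    window = rng.normal(bias, 1.0, size=500)      # sliding window of new data
    kld = kl_histogram(reference, window)
    print(f"bias={bias:.1f}  KLD={kld:.4f}  alarm={kld > threshold}")
```

The divergence is nonnegative and near zero while the window matches the reference PDF; in practice the threshold would be calibrated on fault-free data to fix the false-alarm rate.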