Jensen–Shannon information of the coherent system lifetime (2016)
DOI: 10.1016/j.ress.2016.07.015

Cited by 23 publications (32 citation statements)
References 32 publications
“…They also provided expressions for the Kullback–Leibler discrimination information between mixed system and component lifetimes. Asadi, Ebrahimi, Soofi, and Zohrevand [3] proposed the Jensen–Shannon (JS) information criterion for comparing mixed systems, a scalar function of the signature that ranks systems by their designs. They proved that the JS information is non-negative and that its minimum is attained by r-out-of-n systems.…”
(mentioning; confidence: 99%)
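To make the ranking criterion concrete, here is a minimal numerical sketch of the JS information computed from a signature vector alone. It is not code from the cited paper: it assumes i.i.d. component lifetimes with a Uniform(0,1) parent (so the order statistics are Beta distributed), and the function name js_information, the grid resolution, and the example signatures are illustrative choices.

```python
import numpy as np
from scipy.stats import beta
from scipy.integrate import trapezoid

def js_information(signature, grid_size=20001):
    """JS information of a mixed system with the given signature (sketch).

    With i.i.d. component lifetimes, the system lifetime density is the
    signature mixture of order-statistic densities, and JS is the
    signature-weighted KL divergence of each order statistic from that
    mixture. A Uniform(0,1) parent is used; per the cited result the
    value depends on the signature alone.
    """
    s = np.asarray(signature, dtype=float)
    n = len(s)
    x = np.linspace(1e-6, 1.0 - 1e-6, grid_size)
    # For a uniform parent, the i-th order statistic is Beta(i, n - i + 1).
    comps = np.array([beta.pdf(x, i, n - i + 1) for i in range(1, n + 1)])
    mix = s @ comps  # mixture density of the system lifetime
    js = 0.0
    for s_i, f_i in zip(s, comps):
        if s_i > 0:
            js += s_i * trapezoid(f_i * np.log(f_i / mix), x)
    return js

# r-out-of-n designs have a degenerate signature, so JS = 0 (the stated minimum):
print(js_information([0.0, 0.0, 1.0]))   # parallel (1-out-of-3) system
# A genuinely mixed design, e.g. signature (1/3, 2/3, 0), gives JS > 0:
print(js_information([1/3, 2/3, 0.0]))
```

The first call returns (numerically) zero, consistent with the statement that the minimum of the JS information is attained by r-out-of-n systems; the second returns a strictly positive value, so the criterion separates the two designs.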
“…The difference between the two sides of the inequality gives the Jensen–Shannon (JS) divergence of the mixture error model, which has the following representations: $$JS(f_\varepsilon : f_1, f_2, \pi) = H(\varepsilon) - [\pi H(\varepsilon_1) + (1-\pi) H(\varepsilon_2)] = \pi K(f_1 : f_\varepsilon) + (1-\pi) K(f_2 : f_\varepsilon) \ge 0.$$ The inequality becomes an equality if and only if $f_k(\varepsilon) = f_\varepsilon(\varepsilon)$, $k = 1, 2$, almost everywhere, implying that there is no outlier. Using this and an upper bound for JS given by Asadi et al., we have the following bounds for the error entropy: $$\pi H_1 + (1-\pi) H_2 \le H(\varepsilon) \le \pi H_1 + (1-\pi) H_2 + \pi (1-\pi) J(f_1, f_2),$$ where $H_k = H(\varepsilon_k)$, $k = 1, 2$, and $J(f_1, f_2) = K(f_1 : f_2) + K(f_2 : f_1)$ is the Jeffreys divergence. Estimates of these bounds can be used to assess the presence of outliers.…”
Section: Mixture Models (mentioning; confidence: 99%)
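For concreteness, the divergence and the entropy bounds quoted above can be evaluated numerically in a simple case. The sketch below assumes a two-component normal mixture for the error term; the mixing weight, the component parameters, and the integration range are illustrative choices, not values from the cited work.

```python
import numpy as np
from scipy.stats import norm
from scipy.integrate import quad

# Illustrative two-component normal mixture error model: a "clean" component
# and a shifted, more dispersed "outlier" component (assumed parameters).
pi_w, mu2, sd2 = 0.9, 3.0, 2.0
f1, f2 = norm(0.0, 1.0), norm(mu2, sd2)

def f_mix(t):
    """Density of the mixture error eps."""
    return pi_w * f1.pdf(t) + (1.0 - pi_w) * f2.pdf(t)

H1, H2 = float(f1.entropy()), float(f2.entropy())   # closed-form normal entropies
H_mix, _ = quad(lambda t: -f_mix(t) * np.log(f_mix(t)), -15.0, 25.0)

# JS divergence of the mixture: H(eps) - [pi*H1 + (1 - pi)*H2] >= 0.
JS = H_mix - (pi_w * H1 + (1.0 - pi_w) * H2)

def kl_normal(m1, s1, m2, s2):
    """Closed-form K(N(m1, s1^2) : N(m2, s2^2))."""
    return np.log(s2 / s1) + (s1**2 + (m1 - m2) ** 2) / (2.0 * s2**2) - 0.5

# Jeffreys divergence J(f1, f2) = K(f1 : f2) + K(f2 : f1).
J12 = kl_normal(0.0, 1.0, mu2, sd2) + kl_normal(mu2, sd2, 0.0, 1.0)

lower = pi_w * H1 + (1.0 - pi_w) * H2
upper = lower + pi_w * (1.0 - pi_w) * J12
print(f"JS = {JS:.4f}; bounds on H(eps): {lower:.4f} <= {H_mix:.4f} <= {upper:.4f}")
```

By construction, the gap between H(ε) and the lower bound is exactly the JS divergence, so an estimated gap near zero points to the no-outlier case in which the two error components coincide.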
“…They also investigated the discrimination information between the system's lifetime and the parent distribution, as well as the order statistics. In Asadi et al., a useful merging of information theory with the mixture representation of the system reliability function was presented. Specifically, they showed that the Jensen–Shannon divergence of the mixture distribution provides an information criterion for comparing competing systems based solely on the systems' designs, while the common distribution of the i.i.d.…”
Section: Preliminaries and Notations (mentioning; confidence: 99%)