Testing Exponentiality Based on Kullback-Leibler Information With Progressively Type-II Censored Data
2007 · DOI: 10.1109/tr.2007.895308

Cited by 90 publications (33 citation statements) · References 17 publications
“…A large number of subsequent studies have successively introduced the concepts of the extension of Hartley entropy and Shannon entropy [16], relative entropy [17], cumulative residual entropy [18][19][20][21], joint entropy [22,23], conditional entropy [24][25][26], mutual information [27][28][29][30][31][32], cross entropy [33][34][35][36][37][38], fuzzy entropy [15,39], the maximum entropy principle [40,41], and the minimum cross-entropy principle [42,43], and a series of results have been obtained in these areas. Zhong makes use of general information functions to unify the methods of describing information metrics with entropy formulas [4].…”
Section: About the Metrics of Information
Citation type: mentioning (confidence: 99%)
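To make the quantities enumerated in that excerpt concrete, the following is a minimal sketch of the standard discrete definitions of Shannon entropy, joint entropy, conditional entropy, relative entropy (KL divergence), cross entropy, and mutual information. The joint distribution `pxy` and the reference distribution `qx` are hypothetical values chosen only to exercise the formulas; they do not come from the cited works.

```python
import numpy as np

# Hypothetical discrete joint distribution p(x, y); values are illustrative
# and chosen only to exercise the definitions listed in the excerpt above.
pxy = np.array([[0.30, 0.10],
                [0.20, 0.40]])
px = pxy.sum(axis=1)        # marginal distribution of X
py = pxy.sum(axis=0)        # marginal distribution of Y
qx = np.array([0.5, 0.5])   # reference distribution for relative/cross entropy

shannon = -np.sum(px * np.log2(px))                     # Shannon entropy H(X)
joint = -np.sum(pxy * np.log2(pxy))                     # joint entropy H(X, Y)
conditional = joint - shannon                           # conditional entropy H(Y | X)
relative = np.sum(px * np.log2(px / qx))                # relative entropy D(p_X || q_X)
cross = -np.sum(px * np.log2(qx))                       # cross entropy = H(X) + D(p_X || q_X)
mutual = np.sum(pxy * np.log2(pxy / np.outer(px, py)))  # mutual information I(X; Y)

print(shannon, joint, conditional, relative, cross, mutual)
```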
“…Some extensions of KL information have been studied by several authors, including Park (2012) and Park and Shin (2013) for the Type-I censored case and Balakrishnan et al. (2007) for the progressively Type-II censored case. Baratpour and Rad (2012) recently suggested a cumulative residual KL information, an extension of KL information to the survival function, as…”
Section: Scaled Cumulative Residual KL Information
Citation type: mentioning (confidence: 99%)
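The excerpt above is cut off before the cumulative residual KL statistic is stated. Purely as an illustration, here is a minimal sketch assuming the statistic compares the empirical survival function of a complete sample with the survival function of a fitted exponential distribution through a KL-type integral. The function name `crkl_exponentiality_stat`, the left-Riemann discretization, and the use of the complete-sample MLE are assumptions made for this sketch; this is not the exact definition of Baratpour and Rad (2012), nor the progressively Type-II censored test of Balakrishnan et al. (2007).

```python
import numpy as np

def crkl_exponentiality_stat(x):
    """Illustrative KL-type distance between the empirical survival function
    of a complete sample x and a fitted exponential survival function."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    lam = 1.0 / x.mean()                    # MLE of the exponential rate
    s_emp = 1.0 - np.arange(n) / n          # empirical survival just left of each order statistic
    s_exp = np.exp(-lam * x)                # fitted exponential survival at the order statistics
    integrand = s_emp * np.log(s_emp / s_exp)
    widths = np.diff(np.concatenate(([0.0], x)))  # spacings for a left Riemann sum
    return float(np.sum(integrand * widths))

# Values close to 0 are consistent with exponentiality; larger values suggest departure.
rng = np.random.default_rng(0)
print(crkl_exponentiality_stat(rng.exponential(scale=2.0, size=200)))  # exponential data: near 0
print(crkl_exponentiality_stat(rng.weibull(0.5, size=200)))            # heavy-tailed alternative: typically larger
```

Because the exponential rate is fitted by matching the sample mean, the integral is nonnegative up to discretization error and vanishes only when the empirical and fitted survival functions agree, which is what makes it usable as a goodness-of-fit measure in this sketch.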
“…For example, see D'Agostino and Stephens [8] and Huber-Carol et al. [15]. Moreover, Ebrahimi [11,12], Balakrishnan et al. [4,5], Park [22], Lim and Park [18], Lin et al. [19], Habibi Rad et al. [14], and Pakyari and Balakrishnan [20,21] developed some tests based on censored samples.…”
Section: Introduction
Citation type: mentioning (confidence: 99%)