2014
DOI: 10.1080/03610926.2013.851219
On Estimating the Residual Rényi Entropy under Progressive Censoring

Cited by 5 publications (1 citation statement)
References 16 publications
“…For more information about other applications of Shannon entropy, see Asadi et al., Cover and Thomas, Ebrahimi et al., among others. Dynamic and bivariate versions can be seen in Asadi et al., Chamany and Baratpour, Jomhoori and Yousefzadeh, Navarro et al., and references therein. Another useful measure of the distance between two density functions $f$ and $g$ is the Kullback–Leibler (KL) distance, defined by
$$K(f:g)=\int_0^\infty f(x)\,\log\frac{f(x)}{g(x)}\,dx=-H(f)+H(f,g),$$
where $H(f,g)=-E_f[\log g(X)]$ is known as “Fraser information” (Kent) and is also known as the “inaccuracy measure” (Kerridge).…”

Section: Introduction (mentioning)
Confidence: 99%
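The identity in the quoted statement, $K(f:g)=-H(f)+H(f,g)$, can be checked numerically. Below is a minimal sketch, assuming two illustrative exponential densities chosen here for the example (they are not taken from the cited paper), that computes the KL distance by quadrature and compares it against the Shannon entropy and inaccuracy terms.

```python
# Minimal sketch (not from the cited paper): numerically verify that
# K(f:g) = -H(f) + H(f,g) for two illustrative exponential densities.
import numpy as np
from scipy.integrate import quad

def kl_distance(f, g):
    """Kullback-Leibler distance K(f:g) = int_0^inf f(x) log(f(x)/g(x)) dx."""
    val, _ = quad(lambda x: f(x) * np.log(f(x) / g(x)), 0, np.inf)
    return val

def shannon_entropy(f):
    """Shannon entropy H(f) = -E_f[log f(X)]."""
    val, _ = quad(lambda x: -f(x) * np.log(f(x)), 0, np.inf)
    return val

def inaccuracy(f, g):
    """Kerridge's inaccuracy / Fraser information H(f,g) = -E_f[log g(X)]."""
    val, _ = quad(lambda x: -f(x) * np.log(g(x)), 0, np.inf)
    return val

# Hypothetical example densities: Exp(1) and Exp(2) on (0, inf).
f = lambda x: np.exp(-x)
g = lambda x: 2.0 * np.exp(-2.0 * x)

K = kl_distance(f, g)
print(f"K(f:g)         = {K:.6f}")  # analytic value: 1 - log 2 ~ 0.306853
print(f"-H(f) + H(f,g) = {-shannon_entropy(f) + inaccuracy(f, g):.6f}")
```

For this pair of densities both printed lines agree at roughly 0.306853, matching the closed-form value $1-\log 2$ and illustrating the decomposition of the KL distance into entropy and inaccuracy terms.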