2015
DOI: 10.1239/aap/1449859804

On dynamic mutual information for bivariate lifetimes

Abstract: We consider dynamic versions of the mutual information of lifetime distributions, with a focus on past lifetimes, residual lifetimes and mixed lifetimes evaluated at different instants. This allows us to study multicomponent systems, by measuring the dependence in conditional lifetimes of two components having possibly different ages. We provide some bounds, and investigate the mutual information of residual lifetimes within the time-transformed exponential model (under both the assumptions of unbounded and tr…
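As a reading aid (not part of the paper): the dynamic mutual information described in the abstract is the mutual information of the pair (X, Y) conditioned on survival past given ages. A minimal numerical sketch, assuming a Gumbel type-I bivariate exponential model and plain grid integration purely for illustration:

```python
# Hypothetical sketch, NOT the paper's method: mutual information of the
# residual lifetimes (X, Y) | {X > t1, Y > t2}, estimated by a Riemann sum
# for an assumed Gumbel type-I bivariate exponential model.
import numpy as np

def gumbel_biv_exp_pdf(x, y, theta=0.5):
    # Gumbel's type-I bivariate exponential density with unit exponential
    # marginals and dependence parameter 0 <= theta <= 1.
    return np.exp(-x - y - theta * x * y) * ((1 + theta * x) * (1 + theta * y) - theta)

def residual_mutual_info(t1, t2, theta=0.5, upper=15.0, n=600):
    # Mutual information of (X, Y) given {X > t1, Y > t2}, i.e. the dependence
    # left in two components that have already reached ages t1 and t2.
    x = np.linspace(t1, t1 + upper, n)
    y = np.linspace(t2, t2 + upper, n)
    dx, dy = x[1] - x[0], y[1] - y[0]
    X, Y = np.meshgrid(x, y, indexing="ij")
    joint = gumbel_biv_exp_pdf(X, Y, theta)
    mass = joint.sum() * dx * dy            # approx. P(X > t1, Y > t2)
    cond = joint / mass                     # conditional joint density
    fx = cond.sum(axis=1) * dy              # conditional marginal of X
    fy = cond.sum(axis=0) * dx              # conditional marginal of Y
    integrand = cond * np.log(cond / (fx[:, None] * fy[None, :]))
    return integrand.sum() * dx * dy

print(residual_mutual_info(0.0, 0.0))   # dependence of the original lifetimes
print(residual_mutual_info(1.0, 2.0))   # components with different ages
```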

Cited by 21 publications (10 citation statements) | References 29 publications
“…This measure is also named 'past entropy' of X; it has been investigated in Di Crescenzo and Longobardi [28], Nanda and Paul [29], and Kundu et al. [30]. Other results and applications of these dynamic information measures can be found in Sachlas and Papaioannou [31], Kundu and Nanda [32], and Ahmadi et al. [33].…”
Section: Results on Dynamic Differential Entropies
Confidence: 99%
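For context (an editorial sketch, not taken from the cited works): the 'past entropy' mentioned above is the differential entropy of the conditional lifetime [X | X ≤ t]. A minimal illustration, assuming that standard definition and a unit exponential lifetime:

```python
# Sketch of the past entropy, assuming the standard definition
#   H_bar(t) = - \int_0^t (f(x)/F(t)) * log( f(x)/F(t) ) dx,
# i.e. the differential entropy of [X | X <= t].
import numpy as np
from scipy import integrate
from scipy.stats import expon

def past_entropy(t, dist=expon()):
    F_t = dist.cdf(t)
    integrand = lambda x: -(dist.pdf(x) / F_t) * np.log(dist.pdf(x) / F_t)
    value, _ = integrate.quad(integrand, 0.0, t)
    return value

print(past_entropy(1.0))  # past entropy of a unit exponential lifetime at t = 1
```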
“…It is a special case of an Ali-Mikhail-Haq copula. See, for instance, Ahmadi et al., where it has been used in the analysis of the mutual information of random lifetimes. For the system lifetime $T_1$, recalling the structure given in Table, n. 12, we have
$$\bar F_{T_1}(t) = 1 - \left[1 - \hat C\left(\bar F(t), \bar F(t)\right)\right]^2 = 1 - \left[1 - \frac{\bar F(t)}{2 - \bar F(t)}\right]^2 = \psi_1\left(\bar F(t)\right),$$
where $\psi_1(v) = 1 - \left(1 - \frac{v}{2 - v}\right)^2$, $0 < v < 1$. The CDF of $V_1 = F(T_1)$ is $G_{V_1}(v) = 1 - \psi_1(1 - v)$, and hence we get $g_{V_1}(v) = \psi_1'(1 - v) = \frac{8v}{(1 + v)^3}$, $0 < v < 1$.…”
Section: Systems with Dependent Components
Confidence: 99%
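The quoted computation can be checked symbolically. A sketch, assuming (as the quote indicates) the Ali-Mikhail-Haq copula with parameter 1, i.e. Ĉ(u, v) = uv/(u + v − uv), as the survival copula of the two components:

```python
# Symbolic check of the distortion-function computation quoted above,
# assuming the AMH copula with theta = 1 as the survival copula.
import sympy as sp

u, v = sp.symbols('u v', positive=True)

C_hat = u * v / (u + v - u * v)         # Ali-Mikhail-Haq copula, theta = 1
diag = sp.simplify(C_hat.subs(v, u))    # C_hat(u, u) = u / (2 - u)

# Distortion function: the survival function of T1 is psi1 applied to F_bar(t).
psi1 = 1 - (1 - diag)**2
print(sp.simplify(psi1 - (1 - (1 - u / (2 - u))**2)))   # 0: matches the quote

# Density of V1 = F(T1): g_V1(v) = psi1'(1 - v)
g_V1 = sp.simplify(sp.diff(psi1, u).subs(u, 1 - v))
print(sp.simplify(g_V1 - 8 * v / (1 + v)**3))           # 0: g_V1(v) = 8v/(1+v)^3
print(sp.integrate(g_V1, (v, 0, 1)))                    # 1: a proper density on (0, 1)
```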
“…It is symmetric in the arguments and more concentrated around the diagonal. Two spikes are visible at (0, 0) and (1, 1). The CIR copula is plotted at three different values of the parameter γ in the other panels of the Figure. For very large γ, the noise is very small with respect to the drift, and the CIR copula closely resembles the Gaussian one.…”
Section: Comparison of Different Copula Densities
Confidence: 99%
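For the Gaussian copula mentioned in this comparison, the concentration along the diagonal and the corner spikes can be reproduced directly; the CIR copula itself is model-specific and not reconstructed here. A minimal sketch assuming correlation ρ = 0.8, chosen only for illustration:

```python
# Sketch: Gaussian copula density on a grid, showing the concentration along
# the diagonal and the spikes at (0, 0) and (1, 1) for positive correlation.
import numpy as np
from scipy.stats import norm, multivariate_normal

def gaussian_copula_density(u, v, rho=0.8):
    x, y = norm.ppf(u), norm.ppf(v)
    biv = multivariate_normal(mean=[0.0, 0.0], cov=[[1.0, rho], [rho, 1.0]])
    return biv.pdf(np.dstack((x, y))) / (norm.pdf(x) * norm.pdf(y))

grid = np.linspace(0.01, 0.99, 99)
U, V = np.meshgrid(grid, grid)
c = gaussian_copula_density(U, V)
# center of the diagonal, (0, 0) corner spike, off-diagonal corner (near zero)
print(c[49, 49], c[0, 0], c[0, 98])
```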
“…[22, 25, 34]). They have found application in many different fields, ranging from finance and insurance [12, 15] to reliability [1, 33], stochastic ordering [36], geophysics [42], neuroscience [2, 3, 20, 23, 35, 41], statistics [19], and many more.…”
Section: Introduction
Confidence: 99%