“…, M, are assumed to be equal, its mean value ⟨φ_i⟩ = 0, μ_i^s(t) ≡ μ_s(t), 1 ≤ s ≤ 4; moreover, taking into account the stochastic representations of log-skew elliptical random vectors [39], the expressions for the univariate and multivariate Shannon entropies (measured in nats) take forms [40] that depend on the parameters σ22, α3, β3, γ3, σ33, α4, β4, γ4, σ44 and the covariances σ12, σ13, σ14, σ23, σ24, σ34. As can be seen from Equation 45, the mutual information, I, is calculated directly by summing the individual entropies and subtracting the joint entropy. The mutual information, I, between two random variables, X_s and X_u, compares the uncertainty of measuring the variables jointly with the uncertainty of measuring the two variables independently, identifies nonlinear dependence between the two variables [41][42][43], and is non-negative and symmetric.…”
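Consistent with the description of Equation 45, the pairwise mutual information reduces to I(X_s; X_u) = H(X_s) + H(X_u) − H(X_s, X_u). The sketch below illustrates this entropy-difference calculation numerically. It is a simplified, hedged example: it uses the closed-form differential entropy of a multivariate Gaussian rather than the paper's log-skew elliptical entropies, and the function names (`gaussian_entropy`, `mutual_information`) and the example covariance matrix `Sigma` are illustrative choices, not taken from the source.

```python
import numpy as np

def gaussian_entropy(cov):
    """Differential entropy (in nats) of a multivariate Gaussian with covariance `cov`.
    Simplified stand-in for the log-skew elliptical entropies of the paper."""
    cov = np.atleast_2d(cov)
    k = cov.shape[0]
    return 0.5 * np.log((2.0 * np.pi * np.e) ** k * np.linalg.det(cov))

def mutual_information(cov, s, u):
    """I(X_s; X_u) = H(X_s) + H(X_u) - H(X_s, X_u), computed from the joint covariance."""
    joint = cov[np.ix_([s, u], [s, u])]
    return (gaussian_entropy(cov[s, s])
            + gaussian_entropy(cov[u, u])
            - gaussian_entropy(joint))

# Hypothetical 4-variable covariance matrix (entries sigma_su, unit variances)
Sigma = np.array([
    [1.0, 0.6, 0.3, 0.1],
    [0.6, 1.0, 0.5, 0.2],
    [0.3, 0.5, 1.0, 0.4],
    [0.1, 0.2, 0.4, 1.0],
])

# Non-negative and symmetric, as stated in the text;
# for a bivariate Gaussian pair it equals -0.5 * ln(1 - rho^2).
I_12 = mutual_information(Sigma, 0, 1)
print(I_12, -0.5 * np.log(1.0 - 0.6 ** 2))  # both ~0.223 nats
```

Swapping `gaussian_entropy` for the log-skew elliptical entropy expressions of [39,40] would leave the entropy-summation structure of the calculation unchanged.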