2017
DOI: 10.3390/e19100528
Generalized Skew-Normal Negentropy and Its Application to Fish Condition Factor Time Series

Abstract: The problem of measuring the disparity of a particular probability density function from a normal one has been addressed in several recent studies. The most common technique has been to derive exact expressions for information measures over particular distributions. In this paper, we consider a class of asymmetric distributions with a normal kernel, called Generalized Skew-Normal (GSN) distributions. We measure the degrees of disparity of these distributions from the normal distribution…
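The negentropy in the title can be illustrated numerically. The sketch below is not the paper's method; it simply measures the entropy gap between a skew-normal density (the simplest member of the GSN class, with density 2φ(x)Φ(λx)) and the normal density with the same variance. The shape value λ = 3 is illustrative.

```python
# Hedged sketch: numerical negentropy of a skew-normal (SN) density,
# i.e. its disparity from normality in the entropy sense.
import numpy as np
from scipy import stats
from scipy.integrate import quad

def differential_entropy(pdf, lo=-20.0, hi=20.0):
    """H(f) = -integral of f(x) log f(x) dx, computed numerically."""
    integrand = lambda x: -pdf(x) * np.log(pdf(x)) if pdf(x) > 0 else 0.0
    val, _ = quad(integrand, lo, hi, limit=200)
    return val

lam = 3.0                      # illustrative shape (skewness) parameter
sn = stats.skewnorm(lam)
h_sn = differential_entropy(sn.pdf)

# Negentropy: entropy of the normal with the same variance minus H(SN),
# J(X) = 0.5*log(2*pi*e*Var(X)) - H(X) >= 0, with equality iff normal.
h_gauss = 0.5 * np.log(2 * np.pi * np.e * sn.var())
negentropy = h_gauss - h_sn
```

Because the normal maximizes entropy for a given variance, the computed negentropy is strictly positive whenever λ ≠ 0.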

Cited by 23 publications (18 citation statements, published 2018–2023).
References 36 publications (72 reference statements).
“…Shannon entropy is a very important inferential measure to explain the variability or uncertainty of a random variable. The Shannon entropy for a random variable X with pdf f is given by (see [24,25]),…”
Section: Estimation Of the Parameters Of Asp With The Generalized Rayleigh (mentioning)
confidence: 99%
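The definition quoted above can be checked numerically. A minimal sketch, using a normal density whose Shannon (differential) entropy has the closed form H = ½ log(2πeσ²):

```python
# Hedged sketch of the Shannon entropy quoted in the citation,
# H(X) = -integral of f(x) log f(x) dx, verified against the
# normal closed form 0.5*log(2*pi*e*sigma^2).
import numpy as np
from scipy import stats
from scipy.integrate import quad

def shannon_entropy(pdf, lo, hi):
    integrand = lambda x: -pdf(x) * np.log(pdf(x)) if pdf(x) > 0 else 0.0
    val, _ = quad(integrand, lo, hi, limit=200)
    return val

sigma = 2.0                                    # illustrative scale
h_numeric = shannon_entropy(stats.norm(0, sigma).pdf, -30, 30)
h_closed = 0.5 * np.log(2 * np.pi * np.e * sigma**2)
# The two values agree to numerical integration precision.
```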
“…The entropy measure of generalized Rayleigh distribution given by Equation (7) can be easily computed by using the (plug-in) estimators of the parameters obtained by the methods of ML, MMSP, MLS, MM, and MLM [24].…”
Section: MLS Estimation (mentioning)
confidence: 99%
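The plug-in principle described above can be sketched end to end: estimate the parameters by maximum likelihood, then evaluate the entropy at the estimates. The generalized Rayleigh (Burr X) pdf used below, f(x) = 2aℓ²x e^{-(ℓx)²}(1 − e^{-(ℓx)²})^{a−1}, is the standard form; the entropy is computed numerically rather than from the paper's Equation (7), which is not reproduced here, and the sample sizes and parameter values are illustrative.

```python
# Hedged illustration of a plug-in entropy estimator for the
# generalized Rayleigh (Burr X) distribution.
import numpy as np
from scipy.integrate import quad
from scipy.optimize import minimize

rng = np.random.default_rng(0)

def gr_pdf(x, a, l):
    z = np.exp(-(l * x) ** 2)
    return 2 * a * l**2 * x * z * (1 - z) ** (a - 1)

def gr_sample(n, a, l):
    # Inverse-CDF sampling from F(x) = (1 - exp(-(l x)^2))^a.
    u = rng.uniform(size=n)
    return np.sqrt(-np.log(1 - u ** (1 / a))) / l

data = gr_sample(2000, a=2.0, l=1.5)

def nll(theta):
    a, l = theta
    if a <= 0 or l <= 0:
        return np.inf
    return -np.sum(np.log(gr_pdf(data, a, l)))

a_hat, l_hat = minimize(nll, x0=[1.0, 1.0], method="Nelder-Mead").x

# Plug-in entropy estimate: H at the fitted parameters, by quadrature.
integrand = lambda x: (-gr_pdf(x, a_hat, l_hat)
                       * np.log(gr_pdf(x, a_hat, l_hat))
                       if gr_pdf(x, a_hat, l_hat) > 0 else 0.0)
h_hat, _ = quad(integrand, 0, 20, limit=200)
```

Any of the other estimation methods named in the citation (MMSP, MLS, MM, MLM) would slot in the same way: only the step that produces (â, ℓ̂) changes.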
“…In this subsection, we investigate the Shannon and Rényi entropy, which are the two most popular entropies for Power Lindley distribution. The Shannon entropy (SE) of a random variable X with pdf f is defined as, see [21],…”
Section: Shannon and Rényi Entropy Of The Power Lindley Distribution (mentioning)
confidence: 99%
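The two entropies named in the citation can be computed numerically for the Power Lindley density. The pdf below, f(x) = (αβ²/(β+1))(1 + x^α) x^{α−1} e^{−βx^α} for x > 0, is the standard Power Lindley form; the parameter values are illustrative, and the Rényi entropy of order q is H_q = log(∫ f^q dx)/(1 − q).

```python
# Hedged sketch: Shannon and Rényi entropies of the Power Lindley
# density by numerical quadrature.
import numpy as np
from scipy.integrate import quad

def pl_pdf(x, a, b):
    return (a * b**2 / (b + 1)) * (1 + x**a) * x ** (a - 1) * np.exp(-b * x**a)

a, b = 2.0, 1.0                # illustrative shape and rate

# Shannon entropy: H = -integral of f log f over the support.
shannon, _ = quad(lambda x: -pl_pdf(x, a, b) * np.log(pl_pdf(x, a, b)),
                  1e-9, 20, limit=200)

# Rényi entropy of order q: H_q = log(integral of f^q) / (1 - q).
def renyi(q):
    integral, _ = quad(lambda x: pl_pdf(x, a, b) ** q, 1e-9, 20, limit=200)
    return np.log(integral) / (1 - q)
```

As q → 1 the Rényi entropy converges to the Shannon entropy, and H_q is non-increasing in q, which gives two easy sanity checks on the quadrature.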
“…The elements of the Fisher information matrix I given by (21) are immediately written from elements of the Hessian matrix H(θ). However, an explicit form of the Fisher information matrix I cannot be derived.…”
Section: Maximum Likelihood Estimation (mentioning)
confidence: 99%
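When no explicit Fisher information is available, a standard workaround of the kind the citation implies is to use the observed information, the negative Hessian of the log-likelihood at the ML estimate, approximated by finite differences. The normal model below is only a stand-in to keep the sketch self-contained; it is not the model from the cited paper.

```python
# Hedged sketch: observed information matrix I_hat = -H(theta_hat)
# via central finite differences on the log-likelihood.
import numpy as np

def loglik(theta, data):
    mu, sigma = theta
    return np.sum(-0.5 * np.log(2 * np.pi * sigma**2)
                  - (data - mu) ** 2 / (2 * sigma**2))

def observed_information(theta, data, eps=1e-5):
    """Negative Hessian of the log-likelihood, central differences."""
    theta = np.asarray(theta, dtype=float)
    k = len(theta)
    H = np.zeros((k, k))
    for i in range(k):
        for j in range(k):
            e_i = np.zeros(k); e_i[i] = eps
            e_j = np.zeros(k); e_j[j] = eps
            H[i, j] = (loglik(theta + e_i + e_j, data)
                       - loglik(theta + e_i - e_j, data)
                       - loglik(theta - e_i + e_j, data)
                       + loglik(theta - e_i - e_j, data)) / (4 * eps**2)
    return -H

rng = np.random.default_rng(1)
data = rng.normal(2.0, 1.5, size=5000)
theta_hat = np.array([data.mean(), data.std()])   # ML estimates
info = observed_information(theta_hat, data)
# For the normal model, info[0, 0] should be close to n / sigma_hat^2.
```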
“…The SRG distribution has as particular cases the Reflected-GZ and GZ distributions, when ε → 1 and ε → −1, respectively. The SRG distribution family can also represent a suitable competitor against the skew-normal (SN, [3]) and epsilon-skew-normal (ESN, [4]) distributions as a way to fit asymmetrical datasets. Indeed, refs.…”
Section: Introduction (mentioning)
confidence: 99%
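The two competitor families the citation names can be written side by side. The sketch below uses the standard skew-normal density 2φ(x)Φ(αx) and an epsilon-skew-normal built from two half-normals with scales (1+ε) and (1−ε); the shape values and the sign convention for ε are illustrative assumptions, not taken from the cited papers.

```python
# Hedged sketch: skew-normal (SN) vs epsilon-skew-normal (ESN)
# densities, two families for fitting asymmetric data.
import numpy as np
from scipy import stats
from scipy.integrate import quad

def sn_pdf(x, alpha):
    # Azzalini-type skew-normal: 2 * phi(x) * Phi(alpha * x).
    return 2 * stats.norm.pdf(x) * stats.norm.cdf(alpha * x)

def esn_pdf(x, eps):
    # Two half-normal pieces with scales (1+eps) for x < 0
    # and (1-eps) for x >= 0; the pieces integrate to 1 overall.
    scale = np.where(x < 0, 1 + eps, 1 - eps)
    return stats.norm.pdf(x / scale)

# Both are proper densities for any admissible shape value:
area_sn, _ = quad(lambda x: sn_pdf(x, 3.0), -20, 20)
area_esn, _ = quad(lambda x: esn_pdf(x, 0.5), -20, 20)
```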