2021
DOI: 10.3390/e23060778
Global Sensitivity Analysis Based on Entropy: From Differential Entropy to Alternative Measures

Abstract: Differential entropy can be negative, while discrete entropy is always non-negative. This article shows that negative entropy is a significant flaw when entropy is used as a sensitivity measure in global sensitivity analysis. Global sensitivity analysis based on differential entropy cannot have negative entropy, just as Sobol sensitivity analysis does not have negative variance. Entropy is similar to variance but does not have the same properties. An alternative sensitivity measure based on the approximation o…
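To make the abstract's opening claim concrete, the short sketch below (not taken from the paper) compares the discrete Shannon entropy of a probability mass function, which is always non-negative, with the differential entropy of a normal distribution, which turns negative once the standard deviation is small enough; the particular distributions are illustrative assumptions.

```python
import numpy as np
from scipy.stats import entropy, norm

# Discrete Shannon entropy of a probability mass function is always >= 0.
pmf = np.array([0.7, 0.2, 0.1])
print(f"discrete entropy: {entropy(pmf):.4f} nats")

# Differential entropy of N(0, sigma^2) is 0.5 * ln(2 * pi * e * sigma^2),
# which becomes negative for sufficiently small sigma.
for sigma in (1.0, 0.1):
    h = float(norm(scale=sigma).entropy())
    print(f"differential entropy, sigma={sigma}: {h:.4f} nats")
# sigma = 0.1 gives roughly -0.88 nats: a negative "uncertainty" value, which
# is the flaw the abstract points out for entropy-based sensitivity measures.
```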

Cited by 14 publications (4 citation statements). References 75 publications (114 reference statements).
“…For instance, entropy-based methods and methods that consider the entire probability density function are favorable when there is a strong skewness within the output distribution or when the output is multi-modal [65,66]. There was, however, no strong skewness found in our outputs of interest, so there is a strong indication that the variance is a good proxy for the uncertainty of our model. Nonetheless, one should consider this before applying SSA to other applications.…”
Section: Verification of SSA Approach
Mentioning confidence: 76%
“…However, other studies have indicated that other methods might be preferable if the variance is not a good measure. For instance, entropy-based methods and methods that consider the entire probability density function are favorable when there is a strong skewness within the output distribution or when the output is multi-modal [65,66]. There was, however, no strong skewness found in our outputs of interest, so there is a strong indication that the variance is a good proxy for the uncertainty of our model.…”
Section: Discussion
Mentioning confidence: 99%
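The two statements above hinge on checking whether the output distribution is strongly skewed or multi-modal before trusting variance as an uncertainty proxy. A minimal screening sketch, with a placeholder output sample and an arbitrary illustrative skewness cutoff (neither taken from the cited papers), could look like this:

```python
import numpy as np
from scipy.stats import kurtosis, skew

rng = np.random.default_rng(1)

# Placeholder model outputs standing in for the quantity of interest.
outputs = rng.normal(loc=10.0, scale=2.0, size=50_000)

g1 = skew(outputs)
g2 = kurtosis(outputs)  # excess kurtosis, 0 for a normal distribution
print(f"sample skewness: {g1:.3f}, excess kurtosis: {g2:.3f}")

# Rough, illustrative rule of thumb (the 0.5 cutoff is arbitrary): if the
# output is roughly symmetric and unimodal, variance-based indices are
# usually an adequate proxy; otherwise density/entropy-based indices may
# justify the extra cost of estimating the full output distribution.
if abs(g1) < 0.5:
    print("low skewness -> variance looks like a reasonable proxy here")
else:
    print("strong skewness -> consider density/entropy-based indices")
```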
“…Entropy-based SA also belongs to the category of density-based SA, where uncertainty is characterized by examining the entire distribution of model outputs, not just its variance. The use of entropy instead of variance is usually justified by the need to analyze the output random variable with heavy-tail or outliers [23]. However, density-based indices are more difficult to implement than variance-based ones, owing to the fact that their computation demands the understanding of a large number of conditional PDFs [21].…”
Section: Distribution-Based Approach
Mentioning confidence: 99%
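As a rough illustration of why density-based (entropy-based) indices are costlier than variance-based ones, the sketch below estimates a mutual-information sensitivity measure for a hypothetical two-input test model by binning the joint input–output sample; the model, sample size, and bin count are assumptions chosen only for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical test model (not from the cited papers): Y = X1^2 + 0.1 * X2.
n = 100_000
x1 = rng.uniform(-1.0, 1.0, n)
x2 = rng.uniform(-1.0, 1.0, n)
y = x1**2 + 0.1 * x2

def mutual_information(x, y, bins=30):
    """Plug-in estimate of I(X; Y) in nats from a 2-D histogram.

    This is one simple density-based sensitivity measure; it requires
    estimating joint/marginal densities, which is what makes such
    indices harder to compute than variance-based ones.
    """
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

for name, x in (("X1", x1), ("X2", x2)):
    print(f"I({name}; Y) = {mutual_information(x, y):.3f} nats (plug-in estimate)")
# X1 dominates the output, so I(X1; Y) comes out much larger than I(X2; Y).
```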
“…In some of our previous works [9,35,36], the problem of the sensitivity analysis of output model characteristics to the distributions of the input characteristics was considered. The papers of Z. Kala presented a general approach to the problems of sensitivity analysis [37,38].…”
Section: Introduction
Mentioning confidence: 99%