2017
DOI: 10.3390/e19090486

Use of the Principles of Maximum Entropy and Maximum Relative Entropy for the Determination of Uncertain Parameter Distributions in Engineering Applications

Abstract: The determination of the probability distribution function (PDF) of uncertain input and model parameters in engineering application codes is an issue of importance for uncertainty quantification methods. One of the approaches that can be used for the PDF determination of input and model parameters is the application of methods based on the maximum entropy principle (MEP) and the maximum relative entropy principle (MREP). These methods determine the PDF that maximizes the information entropy when only partial in…
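As an illustration of the MEP approach the abstract describes, here is a minimal numerical sketch (my own, not code from the paper): on a bounded interval with only the mean known, the entropy-maximizing density is a truncated exponential f(t) ∝ exp(−λt), with the multiplier λ fixed by the mean constraint. The function name `maxent_pdf_mean` is hypothetical.

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import brentq

def maxent_pdf_mean(a, b, mean):
    """Maximum-entropy density on [a, b] subject only to E[T] = mean.

    The MaxEnt solution is f(t) = exp(-lam * t) / Z; lam = 0 recovers
    the uniform density (mean at the midpoint of [a, b])."""
    def mean_of(lam):
        if abs(lam) < 1e-12:          # uniform limit
            return 0.5 * (a + b)
        z, _ = quad(lambda t: np.exp(-lam * t), a, b)
        m, _ = quad(lambda t: t * np.exp(-lam * t) / z, a, b)
        return m

    # Solve the moment constraint for the Lagrange multiplier lam.
    lam = brentq(lambda l: mean_of(l) - mean, -50.0, 50.0)
    z, _ = quad(lambda t: np.exp(-lam * t), a, b)
    return lambda t: np.exp(-lam * t) / z

f = maxent_pdf_mean(0.0, 1.0, 0.3)                  # mean below midpoint -> decaying density
mean_check, _ = quad(lambda t: t * f(t), 0.0, 1.0)  # recovers the imposed mean
```

With two or more moment constraints the same root-finding idea generalizes to a small nonlinear system for the multipliers.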

Cited by 34 publications (10 citation statements)
References 36 publications
“…While the term ‘interaction’ can acquire different meanings in different cases, the MaxEnt distribution focuses on the most relevant part of the correlations and is therefore capable of extracting a more reliable interaction structure from data than that obtainable via a more standard correlation analysis. This property lies in our view at the heart of the success encountered by the maximum entropy method in several applications, in biology as well as in other fields [72], [73], [74]. The above example also shows the centrality of empirical data for maximum entropy inference.…”
Section: Main Text
confidence: 98%
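The excerpt's point that MaxEnt couplings are more informative than raw correlations can be illustrated with the Gaussian case (a sketch under my own assumptions, not the cited papers' models): for given means and covariances, the MaxEnt distribution is Gaussian, and its pairwise couplings are entries of the inverse covariance (precision) matrix. Variables linked only through an intermediary then show strong marginal correlation but near-zero direct coupling.

```python
import numpy as np

# Chain X1 -> X2 -> X3: X1 and X3 interact only through X2.
rng = np.random.default_rng(0)
n = 200_000
x1 = rng.normal(size=n)
x2 = 0.8 * x1 + rng.normal(size=n)
x3 = 0.8 * x2 + rng.normal(size=n)
data = np.stack([x1, x2, x3], axis=1)

corr = np.corrcoef(data, rowvar=False)            # marginal correlations
prec = np.linalg.inv(np.cov(data, rowvar=False))  # Gaussian MaxEnt couplings

# corr[0, 2] is clearly nonzero, but prec[0, 2] is near zero:
# the MaxEnt couplings recover the absence of a direct X1-X3 interaction.
```

For binary data the analogous construction is the pairwise MaxEnt (Ising-type) model, where the couplings must be fitted numerically rather than by matrix inversion.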
“…i.e., all moments of T exist. According to the maximum entropy principle [29,30], our task is to figure out the f(t) that enables the maximum of H(T) under Equations (2)-(4). Obviously, it is a conditional variational problem.…”
Section: Periodic Distribution Function Based on the Maximum Entropy…
confidence: 99%
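For context, the "conditional variational problem" this excerpt mentions has a standard closed-form structure. As a sketch with generic moment constraints E[T^k] = μ_k (standing in for the citing paper's Equations (2)-(4), which are not reproduced here), introducing Lagrange multipliers λ_0, …, λ_m for normalization and the moments gives:

```latex
\max_{f}\; H(T) = -\int f(t)\,\ln f(t)\,\mathrm{d}t,
\qquad \text{s.t.}\quad \int f(t)\,\mathrm{d}t = 1,\quad
\int t^{k} f(t)\,\mathrm{d}t = \mu_{k},\; k = 1,\dots,m;
\qquad\Longrightarrow\qquad
f(t) = \exp\!\Bigl(-\lambda_{0} - \sum_{k=1}^{m} \lambda_{k}\, t^{k}\Bigr).
```

Setting the functional derivative of the Lagrangian to zero yields the exponential-family form on the right; the multipliers are then fixed by substituting back into the constraints.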
“…The values of wave height and period related to the single variable wave heights, and the period when the recurrence period is 5, 10, 20, 50, 100, 200 and 500 years are obtained by Equations (29) and (30), and the corresponding joint recurrence periods are obtained by Equation (33) (see Table 4).…”
Section: Double Entropy Joint Distribution Function and Engineering A…
confidence: 99%
“…The principle of ME is to extract meaningful constraints that predicate the observed signals originated by the system. Following the concept of Jaynes (1963) since 1957, if the probability distribution function (PDF) of a given parameter X, being continuous distribution, is unknown and some parts of the parameter distribution are known, we can adopt the ME algorithm as demonstrated by Muñoz-Cobo et al (2017) to obtain the parameter distribution. In the case of X taking a compact aspect being [a; b], with b > a, the Shannon information entropy is expressed:…”
Section: Maximum Entropy (ME) Principle
confidence: 99%
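Since this excerpt stops at the Shannon entropy for X on a compact interval [a, b], it is worth noting the unconstrained answer: with normalization as the only constraint, the MaxEnt density is uniform, f(x) = 1/(b − a), with entropy ln(b − a). A small numerical check (my own sketch, not from the cited paper):

```python
import numpy as np

a, b = 0.0, 2.0
t = np.linspace(a, b, 10_001)
dt = t[1] - t[0]

def entropy(ft):
    """Shannon differential entropy -integral of f*ln(f), via a Riemann sum."""
    return -np.sum(ft * np.log(ft)) * dt

uniform = np.full_like(t, 1.0 / (b - a))  # the MaxEnt density on [a, b]
linear = (1.0 + t) / 4.0                  # another valid density on [0, 2]

h_uniform = entropy(uniform)              # approximately ln(b - a) = ln 2
h_linear = entropy(linear)                # strictly smaller than h_uniform
```

Any other density on [a, b], such as the linear one above, necessarily has lower entropy; adding moment constraints is what tilts the MaxEnt solution away from uniform.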