1998
DOI: 10.1007/s004770050024

Minimum relative entropy and probabilistic inversion in groundwater hydrology

Abstract: The similarity between maximum entropy (MaxEnt) and minimum relative entropy (MRE) allows recent advances in probabilistic inversion to obviate some of the shortcomings in the former method. The purpose of this paper is to review and extend the theory and practice of minimum relative entropy. In this regard, we illustrate important philosophies on inversion and the similarities and differences between maximum entropy, minimum relative entropy, classical smallest model (SVD) and Bayesian solutions for inverse prob…
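Since the abstract is clipped above, here is a minimal sketch of the MRE inverse problem the paper reviews, as a reader's aid; the notation (prior p, posterior q, constraint functions g_j, data d_j) is generic and not necessarily the authors' own.

\begin{align*}
  \min_{q}\; H(q, p) &= \int q(m)\,\ln\frac{q(m)}{p(m)}\, dm
  && \text{(relative entropy of posterior $q$ to prior $p$)}\\
  \text{subject to}\quad \int q(m)\, dm &= 1
  && \text{(normalization)}\\
  \int g_j(m)\, q(m)\, dm &= d_j, \quad j = 1, \dots, N
  && \text{(data / moment constraints)}
\end{align*}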

Cited by 51 publications (32 citation statements); citing publications span 2000–2022.
References 43 publications.
“…It is important to note that the errors produced are a reflection of the errors in the measurements translated through the appropriate nonlinear kernel that governs the problem along with the regularizer. These errors do not reflect the spread of reasonably probable models, which is one of the main goals in inverse problems for site characterization [Woodbury and Ulrych, 1998]. …”
Section: Inference Solutions to Ill-Posed Problems
confidence: 99%
“…This viewpoint is essentially Bayesian and is readily applicable to the questions that scientists and engineers typically ask. A necessary component of the Jaynes-Cox view is the "principle of maximum entropy" (PME), which replaces the need for subjective prior information in the Bayesian approach and forces all observers who possess common information to produce consistent results [Woodbury and Ulrych, 1998]. It is important to note that the above approach (PME) of determining p(m) is the one which is the most uncommitted with respect to unknown information.…”
Section: Inference Solutions to Ill-Posed Problems
confidence: 99%
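For context on the "principle of maximum entropy" invoked in the statement above, the standard variational form is the following (my notation, not the cited authors'): with no prior entering, all observers who share the same constraints d_j obtain the same maximizer, which is the consistency property the quotation refers to.

\begin{align*}
  \max_{q}\; S(q) &= -\int q(m)\,\ln q(m)\, dm\\
  \text{subject to}\quad \int q(m)\, dm &= 1, \qquad
  \int g_j(m)\, q(m)\, dm = d_j, \quad j = 1, \dots, N.
\end{align*}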
“…minimum and maximum values of the sample), and the prior is often the uniform distribution on [a, b] (the least informative prior). A particularly flexible method of incorporating prior information is by means of the principle of minimum relative entropy, which has found important application in the inversion of linear problems of general interest (see Woodbury and Ulrych, 1998, for a complete review).…”
Section: Maximum Entropy Density Estimation
confidence: 99%
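To make the quoted idea concrete, the following is a small, self-contained numerical sketch, not taken from Woodbury and Ulrych (1998): an MRE density with a uniform prior on [a, b] and a single mean constraint, discretized on a grid. The values of a, b, the target mean, and the SciPy routines are illustrative choices.

import numpy as np
from scipy.integrate import trapezoid
from scipy.optimize import brentq

# Minimal MRE sketch under assumed conditions: a uniform prior on [a, b] and a
# single moment constraint E[m] = d.  The MRE posterior then has the form
#   q(m) proportional to p(m) * exp(-lam * m),
# so the only unknown is the Lagrange multiplier lam, fixed by the constraint.
a, b, d = 0.0, 10.0, 3.0            # prior support and target mean (illustrative values)
grid = np.linspace(a, b, 2001)      # grid discretization of the support
prior = np.full_like(grid, 1.0 / (b - a))

def posterior(lam):
    logw = -lam * grid
    w = prior * np.exp(logw - logw.max())   # shift exponent for numerical stability
    return w / trapezoid(w, grid)           # normalize so the density integrates to 1

def mean_gap(lam):
    # signed difference between the posterior mean and the target moment d
    return trapezoid(grid * posterior(lam), grid) - d

lam = brentq(mean_gap, -50.0, 50.0)  # root-find the multiplier that honours the moment
q = posterior(lam)
print(f"lambda = {lam:.4f}, posterior mean = {trapezoid(grid * q, grid):.4f}")

With the uniform prior this coincides with classical MaxEnt density estimation; an informative prior changes only the prior array.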
“…This method presumes knowledge of a prior probability distribution and produces the posterior probability distribution based on the information provided by new moments. Details of the minimum relative entropy are found in [16].…”
Section: Introduction
confidence: 99%
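Formally, the posterior that the quoted sentence refers to is an exponential tilting of the prior (again in generic notation, not that of reference [16]):

\[
  q(m) \;=\; p(m)\,\exp\!\Big(-\lambda_0 - \sum_{j=1}^{N} \lambda_j\, g_j(m)\Big),
\]

where $\lambda_0$ enforces normalization and each $\lambda_j$ is chosen so that the moment constraint $\int g_j(m)\, q(m)\, dm = d_j$ is honoured.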
“…The use of maximum entropy in hydrology is somewhat limited to evaluate the efficiency of monitoring networks [14] such as a water quality network [15]. Woodbury and Ulrych [16] developed a different approach of entropy estimation called minimum relative entropy for groundwater models. In the minimum relative entropy, knowledge of moments is used as "data" rather than sample values.…”
Section: Introduction
confidence: 99%