2021
DOI: 10.3390/math9050548

Qualitative Properties of Randomized Maximum Entropy Estimates of Probability Density Functions

Abstract: The problem of randomized maximum entropy estimation for the probability density function of random model parameters with real data and measurement noises was formulated. This estimation procedure maximizes an information entropy functional on a set of integral equalities depending on the real data set. The technique of the Gâteaux derivatives is developed to solve this problem in analytical form. The probability density function estimates depend on Lagrange multipliers, which are obtained by balancing the mod…
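The abstract describes the estimation machinery only at a high level, so the sketch below is a minimal, hypothetical illustration of the generic maximum entropy mechanics it refers to, not the paper's randomized formulation with measurement noise: an entropy-maximizing density subject to data-dependent integral equality constraints, parameterized by Lagrange multipliers that are found by balancing those constraints. The grid, the synthetic data, and the moment-type constraint functions (`density`, `balance`) are assumptions made purely for illustration.

```python
import numpy as np
from scipy.optimize import fsolve

# Discretized support for the unknown density (assumed bounded grid).
x = np.linspace(-5.0, 5.0, 2001)
dx = x[1] - x[0]

# Hypothetical "real data": noisy observations of a scalar model parameter.
rng = np.random.default_rng(0)
data = 1.0 + 0.8 * rng.standard_normal(400)
targets = np.array([data.mean(), (data ** 2).mean()])  # empirical moments

def density(lams):
    # Entropy-maximizing family: p(x) proportional to exp(-lam1*x - lam2*x^2).
    w = np.exp(-lams[0] * x - lams[1] * x ** 2)
    return w / (w.sum() * dx)  # normalize on the grid

def balance(lams):
    # Residuals of the integral equality (moment) constraints.
    p = density(lams)
    m1 = np.sum(x * p) * dx
    m2 = np.sum(x ** 2 * p) * dx
    return np.array([m1, m2]) - targets

# The Lagrange multipliers are the root of the balance equations.
lams = fsolve(balance, x0=np.array([0.0, 0.5]))
p_hat = density(lams)
print("Lagrange multipliers:", lams)
```

With first- and second-moment constraints the maximizer is a Gaussian-shaped density matched to the empirical moments; the paper's randomized setting additionally accounts for measurement noise in the constraints, which this sketch omits.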

Cited by 2 publications (3 citation statements) | References 26 publications
“…Popkov [15] has formulated the problem of randomized maximum entropy estimation for the probability density function of random model parameters with real data and measurement noises. This estimation procedure maximizes an information entropy functional on a set of integral equalities depending on the real dataset.…”
Section: Papers of the Special Issue
mentioning
confidence: 99%
“…Numerical simulation in physical, social, and life sciences [1-4]; • Modeling and analysis of complex systems based on mathematical methods and AI/ML approaches [5,6]; • Control problems in robotics [3,7-12]; • Design optimization of complex systems [13]; • Modeling in economics and social sciences [4,14]; • Stochastic models in physics and engineering [1,15-18]; • Mathematical models in material science [19]; • High-performance computing for mathematical modeling [20].…”
mentioning
confidence: 99%
“…Wang and Gui [28] used the maximum likelihood and Bayesian methods to obtain the estimators of the entropy for a two-parameter Burr type XII distribution under progressive type-II censored data. Popkov [15] approached the problem of randomized maximum entropy estimation for the probability density function of random model parameters with real data and measurement noises. This estimation procedure maximizes an information entropy functional on a set of integral equalities. The technique of the Gâteaux derivatives was developed to solve the problem in analytical form.…”
Section: Introduction
mentioning
confidence: 99%