2014
DOI: 10.4236/ns.2014.67054
Discussing an Expected Utility and Weighted Entropy Framework

Abstract: This paper discusses a framework combining traditional expected utility and weighted entropy (EU-WE), also named the mean contributive value index, which may be conceived as a decision-aiding procedure or as a heuristic device for generating compositional scenarios, based on information-theory concepts, namely weighted entropy. New proofs concerning the maximum value of the index and the evaluation of the optimal proportions are outlined, with emphasis on the optimal value of the Lagrange multiplier and its meaning.…
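For orientation, the two ingredients named in the abstract are standard quantities. With a probability vector $P = (p_1, \dots, p_n)$ on the simplex and positive utility weights $u_1, \dots, u_n$ (notation assumed here, not taken from the paper), they can be written as:

```latex
% Expected utility and (Belis-Guiasu) weighted entropy of P under utilities u_i
\[
  \mathrm{EU}(P;U) \;=\; \sum_{i=1}^{n} u_i\, p_i ,
  \qquad
  H_w(P;U) \;=\; -\sum_{i=1}^{n} u_i\, p_i \ln p_i .
\]
```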

Cited by 5 publications (8 citation statements), published between 2015 and 2022; citation types: 1 supporting, 7 mentioning, 0 contrasting. References 33 publications.
“…This work is an analogous development of another recently published paper, in which an expected utility and weighted entropy framework, under the acronym EU-WE, was discussed [7]. We shall prove that the claimed analogy has a proper sense here, as both the weighted Shannon entropy and the weighted Gini-Simpson index may be considered cases of generalized useful information measures.…”
Section: Introduction (supporting; confidence: 56%)
“…First, we shall focus the discussion on comparing the optimal proportions of the function u_Z with those of the index u_K presented and discussed in [7]. The analogy stated in the introduction follows from the standard result that Taylor's first order (linear) approximation of the real function…”
Section: Discussion (mentioning; confidence: 99%)
“…According to the basic principles of information theory, information is a measure of the degree of order in a system, whereas entropy is a measure of its degree of disorder [18][19]. An increase in information therefore means a reduction in entropy.…”
Section: Entropy Weight (mentioning; confidence: 99%)
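Since this citation appears in an "Entropy Weight" section, a minimal sketch of the standard entropy-weight method may make the quoted idea concrete: criteria whose values are more ordered (lower entropy) carry more information and receive larger weights. The decision matrix below is a made-up example, and this is not claimed to be the citing paper's exact procedure.

```python
# Minimal sketch of the standard entropy-weight method (illustrative data only).
import numpy as np

X = np.array([[2.0, 7.0, 4.0],    # rows: alternatives, columns: criteria
              [3.0, 5.0, 6.0],
              [4.0, 6.0, 5.0]])

m = X.shape[0]                                  # number of alternatives
P = X / X.sum(axis=0)                           # column-wise proportions p_ij
E = -(P * np.log(P)).sum(axis=0) / np.log(m)    # normalized entropy per criterion, in [0, 1]
w = (1.0 - E) / (1.0 - E).sum()                 # lower entropy -> larger weight
print(w)                                        # weights sum to 1
```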
“…The real function $W_1$ was previously presented and discussed (see [43]); in summary, $W_1$ is a differentiable concave function in the interior of the simplex, attaining minimum and maximum values in the domain. The minimum value is $\min W_1 = \min_{i=1,\dots,n} u_i$, and it is possible to locate the maximum point with a Lagrange multiplier method, the coordinates of the maximum point thus being evaluated by computing $p^*_{1,i} = \exp(-\alpha^*/u_i)$ for $i = 1,\dots,n$, with the optimal value of the Lagrange multiplier ($\alpha^*$) defined implicitly by the equation $\sum_{i=1}^{n} \exp(-\alpha/u_i) = 1$, which can be solved with numerical methods and has a unique solution.…”
Section: The Case For β = (mentioning; confidence: 99%)
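The quoted optimality conditions are explicit enough to sketch numerically. The following is a minimal illustration, not the paper's own code: the utility values are hypothetical placeholders, and brentq is just one of the numerical methods the quote alludes to.

```python
# Solve  sum_i exp(-alpha / u_i) = 1  for the Lagrange multiplier alpha*,
# then recover the optimal proportions p*_{1,i} = exp(-alpha* / u_i).
import math

import numpy as np
from scipy.optimize import brentq

u = np.array([0.9, 0.5, 0.3, 0.2])  # hypothetical positive utilities u_i

def excess(alpha: float) -> float:
    """sum_i exp(-alpha / u_i) - 1, strictly decreasing in alpha for u_i > 0."""
    return np.exp(-alpha / u).sum() - 1.0

# Bracket the unique root: at alpha = 0 the sum equals n > 1, while for
# alpha > max(u) * ln(n) every term falls below 1/n, so the sum drops below 1.
hi = u.max() * math.log(len(u)) + 1.0
alpha_star = brentq(excess, 0.0, hi)

p_star = np.exp(-alpha_star / u)          # optimal proportions on the simplex
print(alpha_star, p_star, p_star.sum())   # p_star.sum() should be close to 1
```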