2014
DOI: 10.3390/e16052713
Non-Extensive Entropy Econometrics: New Statistical Features of Constant Elasticity of Substitution-Related Models

Abstract: Power-law (PL) formalism is known to provide an appropriate framework for canonical modeling of nonlinear systems. We estimated three stochastically distinct models of constant elasticity of substitution (CES) class functions as non-linear inverse problems and showed that these PL-related functions should have a closed form. The first model is related to an aggregator production function, the second to an aggregator utility function (the Armington) and the third to an aggregator technical transformation function…

Cited by 10 publications (10 citation statements); references 23 publications.
“…Particular attention should be paid when predicting the GDP components that account for smaller proportions of the GDP structure. In fact, the Tsallis entropy estimators also belong to the family of Stein estimators [16], meaning that smaller probabilities are shrunk while higher probabilities dominate in the solution space. Adding more consistency-related information to the less influential components will enhance parameter precision and then allow their influence to be recovered while dampening that of bigger events.…”
Section: Discussion
confidence: 99%
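The shrinkage behaviour described in this excerpt can be illustrated numerically with the q-escort transform that underlies Tsallis-entropy estimation. This is a minimal sketch, not code from the cited paper; the function name `escort` and the toy distribution are illustrative assumptions:

```python
import numpy as np

def escort(p, q):
    """q-escort transform p_i^q / sum_j p_j^q.

    For q > 1, small probabilities are shrunk toward zero while
    large probabilities gain relative weight -- the Stein-like
    shrinkage behaviour noted in the citation above.
    """
    w = np.power(np.asarray(p, dtype=float), q)
    return w / w.sum()

p = np.array([0.6, 0.3, 0.1])
print(escort(p, 2.0))  # largest mass rises above 0.6, smallest falls below 0.1
```

With q = 2 the probabilities (0.6, 0.3, 0.1) become roughly (0.78, 0.20, 0.02): the dominant component grows and the small ones shrink, which is exactly why the excerpt recommends adding consistency information for the less influential GDP components.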
“…(Eqs. (3)–(8)) should increase the quality of the forecast, in conformity with the maximum entropy principle connected with the Bayesian optimal processing rule. Finally, as far as the above parameter confidence area is concerned, we refer the reader to [16]. It has been shown in [21] that the computed indexes fulfil the basic Fisher–Rao–Cramér information index properties, including symmetry, continuity, additivity and maximum.…”
Section: The q-Generalized Kullback–Leibler Information Divergence Model
confidence: 99%
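For readers unfamiliar with the divergence named in this section title, here is a minimal sketch of the q-generalized (Tsallis) Kullback–Leibler divergence, which recovers the ordinary KL divergence in the limit q → 1. The function `q_kl` and the toy distributions are illustrative assumptions, not taken from the cited model:

```python
import numpy as np

def q_kl(p, r, q):
    """Tsallis (q-generalized) KL divergence D_q(p || r).

    D_q = sum_i p_i * ((p_i / r_i)**(q - 1) - 1) / (q - 1);
    as q -> 1 this reduces to the standard KL divergence
    sum_i p_i * log(p_i / r_i).
    """
    p = np.asarray(p, dtype=float)
    r = np.asarray(r, dtype=float)
    if abs(q - 1.0) < 1e-9:
        return float(np.sum(p * np.log(p / r)))
    return float(np.sum(p * ((p / r) ** (q - 1.0) - 1.0)) / (q - 1.0))

p = [0.5, 0.3, 0.2]
uniform = [1 / 3, 1 / 3, 1 / 3]
print(q_kl(p, uniform, 1.0), q_kl(p, uniform, 1.5))
```

As a sanity check, the divergence is zero when the two distributions coincide, and values for q slightly above 1 stay close to the ordinary KL value.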
“…1, which displays the different values (for the period 1998–2005) of the entropy coefficient change expressed in Eq. (7). In this case, the entropy value is smaller when we take the coefficients from the period 2005.…”
Section: Recovery of Polish Regional Greenhouse Emissions by Industry
confidence: 99%
“…The literature on methodologies dedicated to ill-posed inverse problem recovery is extensive. Besides the well-known Tikhonov regularization theory [4], the Gibbs–Shannon–Jaynes [5,6] maximum (minimum) entropy principle and its recent extensions [7,8] remain the most used techniques for solving this class of problems. The general principle, which applies to both approaches, is the association of a linear or nonlinear least-squares problem with a regularization term (a priori or additional information) so as to obtain a well-posed problem.…”
Section: Introduction
confidence: 99%
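The regularized least-squares idea described in this excerpt can be written down in a few lines. Below is a minimal Tikhonov (ridge) solver for min_x ||Ax − b||² + λ||x||², offered as an illustrative sketch rather than the estimator used in the paper; the matrix `A` and data `b` are toy values:

```python
import numpy as np

def tikhonov(A, b, lam):
    """Solve min_x ||Ax - b||^2 + lam * ||x||^2 via the normal equations.

    The regularization term lam * I makes an otherwise ill-posed
    least-squares problem well-posed, as described in the excerpt.
    """
    A = np.asarray(A, dtype=float)
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

# Nearly collinear columns make the unregularized problem ill-conditioned.
A = np.array([[1.0, 1.0],
              [1.0, 1.000001],
              [1.0, 0.999999]])
b = np.array([2.0, 2.0, 2.0])
x = tikhonov(A, b, 1e-3)
print(x)  # a small, stable solution rather than a huge oscillating one
```

The entropy-based approaches cited above replace the quadratic penalty ||x||² with an entropy term over the unknown probabilities, but the structure — data-fit objective plus a regularizer encoding a priori information — is the same.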