2015
DOI: 10.5539/mas.v9n4p170

Adjusted Adaptive LASSO in High-dimensional Poisson Regression Model

Abstract: The LASSO has been widely studied and used in many applications, but it does not possess the oracle properties. Relying on a consistent initial parameter vector, the adaptive LASSO does enjoy the oracle properties: it is consistent in variable selection and asymptotically normal in coefficient estimation. In the Poisson regression model, the usual adaptive LASSO, which uses the maximum likelihood coefficient estimates as its initial values, can perform very poorly when multicollinearity is present. In this study, we propose an adjustment of the a…
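For a concrete picture of the estimator class discussed in the abstract, the following is a minimal, hedged sketch of an adaptive LASSO for Poisson regression in Python (statsmodels and scikit-learn). It uses a ridge-type Poisson fit to build the adaptive weights, which is one simple way to avoid unstable maximum likelihood initial estimates under multicollinearity; it is illustrative only and is not the paper's proposed adjustment, whose description is truncated above. The tuning values gamma, lam, and eps are illustrative choices.

```python
# A minimal sketch (not the authors' proposed method) of an adaptive LASSO for
# Poisson regression.  A ridge-type Poisson fit provides the initial
# coefficients for the adaptive weights, which is one simple way to keep the
# weights stable under multicollinearity.  All tuning values are illustrative.
import numpy as np
import statsmodels.api as sm
from sklearn.linear_model import PoissonRegressor

rng = np.random.default_rng(0)
n, p = 200, 8
X = rng.normal(size=(n, p))
beta_true = np.array([1.0, -0.5, 0.0, 0.0, 0.8, 0.0, 0.0, 0.0])
y = rng.poisson(np.exp(X @ beta_true))

# Step 1: initial estimate via L2-penalised (ridge-type) Poisson regression,
# instead of the plain MLE that degrades under multicollinearity.
init = PoissonRegressor(alpha=0.1).fit(X, y)
beta_init = np.concatenate(([init.intercept_], init.coef_))

# Step 2: adaptive weights w_j = 1 / |beta_init_j|**gamma (intercept unpenalised).
gamma, eps, lam = 1.0, 1e-6, 0.05
w = 1.0 / (np.abs(beta_init) + eps) ** gamma
w[0] = 0.0                                   # leave the intercept unpenalised

# Step 3: weighted L1 (adaptive LASSO) fit of the Poisson log-likelihood.
X_c = sm.add_constant(X)
fit = sm.GLM(y, X_c, family=sm.families.Poisson()).fit_regularized(
    alpha=lam * w, L1_wt=1.0)
print(np.round(fit.params, 3))
```

In practice, lam and gamma would be tuned, for example by cross-validation or an information criterion.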

Cited by 16 publications (5 citation statements) | References 25 publications
“…In this section, the same simulation settings of [29] and [4] are used. For all the simulation examples 1-3, the response variable is generated according to the PRM as $y_i \sim \mathrm{Po}(\exp(x_i^{T}\beta))$. Simulations 4-6 are the same as the settings of simulations 1-3, where the response variable is generated according to the NBRM with conditional mean $\exp(x_i^{T}\beta)$ and $\tau = 2$.…”
Section: Simulation Results
confidence: 99%
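As a side note on the quoted simulation design, the sketch below shows one way such Poisson (PRM) and negative binomial (NBRM) responses with conditional mean exp(x_i^T β) could be generated. The NB2 parameterisation Var(y) = μ + μ²/τ is assumed, which may differ from the convention in the cited papers; β, n, and p are illustrative.

```python
# Hedged sketch of the quoted simulation design: responses with conditional
# mean exp(x_i^T beta) drawn from a Poisson model (PRM) and from a negative
# binomial model (NBRM) with tau = 2.  The NB2 parameterisation
# Var(y) = mu + mu^2 / tau is assumed; beta, n and p are illustrative.
import numpy as np

rng = np.random.default_rng(42)
n, p = 100, 4
X = rng.normal(size=(n, p))
beta = np.array([0.5, -0.3, 0.2, 0.0])
mu = np.exp(X @ beta)                        # conditional mean exp(x_i^T beta)

y_prm = rng.poisson(mu)                      # Poisson regression model response
tau = 2.0
y_nbrm = rng.negative_binomial(tau, tau / (tau + mu))  # mean mu, var mu + mu^2/tau
```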
“…This term is added in order to shrink the parameters down to a very low variance. The operation of Lasso regression is quite similar to ridge regression and helps to penalize the absolute size of the regression coefficients [37]. In order to improve the accuracy and to mitigate the variability, this model is used, and it is expressed as the argmin…”
Section: Modified Adaboostrt Algorithm Based On Ridge Lasso and Soft Thresholding
confidence: 99%
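The quoted equation is cut off after "argmin". For orientation only, the standard LASSO criterion that the sentence appears to be describing is shown below in its usual linear regression form; the citing paper may use a different variant within its modified AdaBoost framework.

```latex
\hat{\beta}^{\text{lasso}}
  = \operatorname*{arg\,min}_{\beta}
    \left\{ \sum_{i=1}^{n} \bigl( y_i - x_i^{T}\beta \bigr)^{2}
            + \lambda \sum_{j=1}^{p} \lvert \beta_j \rvert \right\},
\qquad \lambda \ge 0 .
```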
“…Other penalized estimators, such as the LASSO estimator (Hossain & Ahmed, 2012), adaptive LASSO estimator (Algamal & Lee, 2015; Hossain & Ahmed, 2012; Ivanoff, Picard, & Rivoirard, 2016), elastic net estimator (Noori Asl, Bevrani, & Arabi Belaghi, 2022), and bridge estimator (Chowdhury, Chatterjee, Mallick, Banerjee, & Garai, 2019), have been proposed, but they also use zero as the shrinkage center and mainly focus on variable selection rather than limiting multicollinearity. The Jackknifed and modified Jackknifed versions of the PRE, PLE, and PKLE (Oranye & Ugwuowo, 2022; Rasheed, Sadik, & Algamal, 2022; Turkan & Ozel, 2016) were also proposed to obtain an almost unbiased estimator by applying the Jackknife procedure for bias reduction.…”
Section: Introduction
confidence: 99%