2019
DOI: 10.19139/soic.v7i2.566
Variable Selection in Count Data Regression Model based on Firefly Algorithm

Abstract: Variable selection is a helpful procedure for improving computational speed and prediction accuracy by identifying the most important variables that are related to the response variable. Count data regression modeling has received much attention in several scientific fields, in which the Poisson and negative binomial regression models are the most basic models. The firefly algorithm is a recently proposed nature-inspired algorithm that can be employed efficiently for variable selection. In this wo…
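The abstract describes selecting predictors for a Poisson regression with a firefly-style metaheuristic. A minimal illustrative sketch of that idea (not the paper's exact algorithm; the binary encoding, sigmoid move rule, IRLS fitting routine, and all parameter values here are my own assumptions) treats each firefly as a 0/1 mask over the candidate variables and uses the AIC of the Poisson fit as brightness:

```python
import numpy as np

def poisson_aic(X, y, n_iter=40):
    """AIC of a Poisson regression (log link) fit by Newton/IRLS iterations."""
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_iter):
        mu = np.exp(np.clip(X @ beta, -30, 30))
        # Newton step: H = X' diag(mu) X, gradient = X'(y - mu)
        H = X.T @ (X * mu[:, None]) + 1e-8 * np.eye(p)
        beta += np.linalg.solve(H, X.T @ (y - mu))
    mu = np.exp(np.clip(X @ beta, -30, 30))
    ll = np.sum(y * np.log(mu + 1e-12) - mu)  # log(y!) constant dropped
    return -2 * ll + 2 * p

def binary_firefly_select(X, y, n_fireflies=15, n_gen=20, gamma=1.0, seed=0):
    """Binary firefly algorithm: brightness = -AIC of the selected-column fit."""
    rng = np.random.default_rng(seed)
    p = X.shape[1]
    pop = rng.integers(0, 2, size=(n_fireflies, p)).astype(float)

    def fitness(mask):
        cols = np.flatnonzero(mask > 0.5)
        return np.inf if cols.size == 0 else poisson_aic(X[:, cols], y)

    fit = np.array([fitness(m) for m in pop])
    for _ in range(n_gen):
        for i in range(n_fireflies):
            for j in range(n_fireflies):
                if fit[j] < fit[i]:  # firefly j is brighter, so i moves toward j
                    r2 = np.sum((pop[i] - pop[j]) ** 2)
                    attract = np.exp(-gamma * r2)
                    step = attract * (pop[j] - pop[i]) + 0.3 * (rng.random(p) - 0.5)
                    # sigmoid binarization keeps positions in {0, 1}
                    prob = 1.0 / (1.0 + np.exp(-(pop[i] + step)))
                    pop[i] = (rng.random(p) < prob).astype(float)
                    fit[i] = fitness(pop[i])
    best = int(np.argmin(fit))
    return np.flatnonzero(pop[best] > 0.5), fit[best]

# Toy data: only predictors 0 and 1 actually drive the counts.
rng = np.random.default_rng(1)
X = rng.normal(size=(300, 6))
y = rng.poisson(np.exp(0.8 * X[:, 0] - 0.6 * X[:, 1]))
selected, aic = binary_firefly_select(X, y)
print(selected, aic)
```

On the toy data the best firefly's mask should concentrate on the informative columns, since including them lowers the AIC by far more than the 2-per-parameter penalty.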

Cited by 6 publications (3 citation statements); References 25 publications.
“…We use the Python scipy.optimize module (method='SLSQP') to solve the NLO problem of Eqs (20)–(22). We use Gurobi Optimizer 8.1.1 (https://www.gurobi.com/) to solve the MIQO problem of Eqs (13)–(17), and the indicator constraint to impose the logical implication of Eq (15). We fix the L2-regularization parameter to α = 0 in Tables 1, 2 and 6, whereas we tune it through hold-out validation using the training instances in Tables 3, 4 and 7.…”
Section: Eqlspc(h): Setting Equally Spaced Tangent Points
confidence: 99%
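The citing work reports solving its nonlinear optimization (NLO) problem with scipy.optimize's SLSQP method. A minimal self-contained example of that API call (the objective and constraint here are toy stand-ins of my own choosing, not the paper's Eqs (20)–(22)) looks like:

```python
import numpy as np
from scipy.optimize import minimize

# Minimize (x - 1)^2 + (y - 2)^2 subject to x + y <= 2 and x, y >= 0.
objective = lambda v: (v[0] - 1) ** 2 + (v[1] - 2) ** 2

# SLSQP expects inequality constraints in "fun(v) >= 0" form.
constraints = [{"type": "ineq", "fun": lambda v: 2.0 - v[0] - v[1]}]
bounds = [(0, None), (0, None)]

res = minimize(objective, x0=np.zeros(2), method="SLSQP",
               bounds=bounds, constraints=constraints)
print(res.x)  # optimum lies on the constraint boundary at (0.5, 1.5)
```

The unconstrained minimizer (1, 2) violates x + y ≤ 2, so SLSQP returns the projection onto the active constraint, which Lagrange conditions place at (0.5, 1.5).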
“…In contrast, stepwise selection [15,16], which repeats addition and elimination of one explanatory variable at a time, is a practical method for sparse estimation. Several metaheuristic algorithms have been applied to subset selection for Poisson regression [17,18], and various regularization methods have been recently proposed for sparse Poisson regression [19–22]. Note, however, that these (non-exhaustive) sparse estimation methods are heuristic algorithms, which cannot verify optimality of an obtained subset of explanatory variables (e.g., in the maximum likelihood sense).…”
Section: Introduction
confidence: 99%
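The stepwise selection contrasted above with metaheuristics can be sketched as a greedy forward pass that adds whichever variable lowers the AIC most and stops when no addition helps. This is an illustrative sketch under my own assumptions (AIC as the criterion, a Newton/IRLS Poisson fit, forward-only steps), not the cited papers' exact procedure:

```python
import numpy as np

def poisson_aic(X, y, n_iter=40):
    """AIC of a Poisson regression (log link) fit by Newton/IRLS iterations."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        mu = np.exp(np.clip(X @ beta, -30, 30))
        H = X.T @ (X * mu[:, None]) + 1e-8 * np.eye(X.shape[1])
        beta += np.linalg.solve(H, X.T @ (y - mu))
    mu = np.exp(np.clip(X @ beta, -30, 30))
    return -2 * np.sum(y * np.log(mu + 1e-12) - mu) + 2 * X.shape[1]

def forward_stepwise(X, y):
    """Greedily add the column that lowers AIC most; stop when none does."""
    remaining = list(range(X.shape[1]))
    chosen, best_aic = [], np.inf
    while remaining:
        aic, j = min((poisson_aic(X[:, chosen + [j]], y), j) for j in remaining)
        if aic >= best_aic:
            break  # no candidate improves the criterion
        best_aic = aic
        remaining.remove(j)
        chosen.append(j)
    return sorted(chosen), best_aic

# Toy data: only predictors 0 and 2 drive the counts.
rng = np.random.default_rng(2)
X = rng.normal(size=(400, 5))
y = rng.poisson(np.exp(0.7 * X[:, 0] + 0.5 * X[:, 2]))
chosen, best_aic = forward_stepwise(X, y)
print(chosen, best_aic)
```

As the quoted passage notes, such greedy (non-exhaustive) searches cannot certify that the returned subset is optimal; they only guarantee that no single further addition improves the criterion.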
“…A genetic algorithm (GA) would be utilized as the optimization technique in order to reach the desired result. The use of this algorithm is similar to other metaheuristic algorithms such as firefly algorithm implemented by Algamal [2].…”
Section: Introduction
confidence: 99%