2023
DOI: 10.1007/s00500-023-08839-w
New mixed integer fractional programming problem and some multi-objective models for sparse optimization

Abstract: We propose a novel Mixed-Integer Nonlinear Programming (MINLP) model for sparse optimization based on the polyhedral k-norm. We put special emphasis on the application of sparse optimization in Feature Selection for Support Vector Machine (SVM) classification. We address the continuous relaxation of the problem, which comes out in the form of a fractional programming problem (FPP). In particular, we consider a possible way for tackling FPP by reformulating it via a DC (Difference of Convex) decomposition. We a…
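For readers unfamiliar with the notion, the polyhedral k-norm invoked in the abstract is, in its standard definition (a reminder, not a statement of this paper's exact notation), the sum of the k largest absolute components of a vector: \|x\|_{[k]} = \max\{\sum_{i\in I}|x_i| : I\subseteq\{1,\dots,n\},\ |I|=k\}. It reduces to the \ell_1-norm for k = n and to the \ell_\infty-norm for k = 1, and it satisfies \|x\|_{[k]} = \|x\|_1 exactly when x has at most k nonzero components, which is what makes it a natural polyhedral surrogate of the \ell_0 pseudo-norm in sparse optimization.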

Cited by 1 publication
(2 citation statements)
References 42 publications
“…Our primary purpose is to demonstrate the advantages of considering these single-objective models as MOP models. In multi-objective form, we can obtain a set of Pareto-optimal solutions instead of an optimal solution in a single-objective form [58][59][60][61], and then the decision maker can choose one of these solutions [58][59][60].…”
Section: Multi-objective Support Vector Machine
mentioning
confidence: 99%
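As background on the Pareto terminology used in the quoted statement (a standard definition, not taken from the cited works): a feasible point x* of a multi-objective problem \min_x (f_1(x),\dots,f_m(x)) is Pareto-optimal if no feasible x satisfies f_i(x) \le f_i(x^*) for all i with strict inequality for at least one i. A multi-objective SVM therefore returns a set of such non-dominated trade-offs, for example between classification error and sparsity of the feature weights, from which the decision maker picks one, rather than a single minimizer.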
“…Additionally, based on the k-norm, the following problem was proposed in [58][59][60] for sparse optimization:…”
mentioning
confidence: 99%
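The quoted statement refers to a k-norm-based problem from [58][59][60] without reproducing it, so the formulation below is an illustrative sketch of the usual k-norm route to sparsity, not necessarily the exact model of those references. The key observation is that \|x\|_1 - \|x\|_{[k]} \ge 0 for every x, with equality if and only if x has at most k nonzero components, so the cardinality constraint \|x\|_0 \le k can be replaced by the polyhedral condition \|x\|_1 - \|x\|_{[k]} = 0, typically enforced as a penalty:

\min_x\; f(x) + \lambda\,(\|x\|_1 - \|x\|_{[k]}),

where f is the original (e.g. SVM) objective and \lambda > 0 a penalty parameter. Since both norms are convex, the penalty term is a difference of convex functions, which is consistent with the DC decomposition route mentioned in the abstract.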