2023
DOI: 10.3390/electronics12214450

Robust Feature Selection Method Based on Joint L2,1 Norm Minimization for Sparse Regression

Libo Yang, Dawei Zhu, Xuemei Liu, et al.

Abstract: Feature selection methods are widely used in machine learning tasks to reduce dimensionality and improve model performance. However, traditional regression-based feature selection methods often lack robustness and generalization ability and are easily affected by outliers in the data. To address this problem, we propose a robust feature selection method based on sparse regression. This method uses a non-squared form of the L2,1 norm as both the loss function and regularization…
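The objective sketched in the abstract combines an L2,1-norm loss with an L2,1-norm penalty. As a minimal illustration (not the authors' implementation — the function names, the variable names, and the use of NumPy here are my own assumptions), the L2,1 norm of a matrix is the sum of the Euclidean norms of its rows, and feature relevance is typically read off the row norms of the learned projection matrix W:

```python
import numpy as np

def l21_norm(M):
    """L2,1 norm of a matrix: sum of the Euclidean (L2) norms of its rows.

    Unlike the squared Frobenius loss, this grows linearly in each row's
    magnitude, which is why it is less sensitive to outlying samples.
    """
    return float(np.sum(np.linalg.norm(M, axis=1)))

def objective(X, Y, W, gamma):
    """Joint L2,1 objective ||XW - Y||_{2,1} + gamma * ||W||_{2,1}.

    X: (n_samples, n_features) data matrix
    Y: (n_samples, n_classes)  target matrix
    W: (n_features, n_classes) projection matrix being learned
    gamma: regularization strength (>= 0)
    """
    return l21_norm(X @ W - Y) + gamma * l21_norm(W)

def feature_scores(W):
    """Rank features by the row norms of W; rows driven to zero by the
    L2,1 penalty correspond to discarded features."""
    return np.linalg.norm(W, axis=1)
```

Because the penalty acts on whole rows of W, it zeroes out entire features jointly across all outputs, which is what makes the L2,1 norm suitable for feature selection rather than generic sparsity.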

Cited by 4 publications (1 citation statement); references 42 publications.
“…Two common types of regularization are L1 regularization (Lasso) and L2 regularization (Ridge). For both L1 and L2 regularization, the regularization strength typically ranges from 0 to positive infinity [8].…”
Section: Loss Function
Confidence: 99%