Proceedings of the 20th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining 2014
DOI: 10.1145/2623330.2623635
Gradient boosted feature selection

Abstract: A feature selection algorithm should ideally satisfy four conditions: reliably extract relevant features; be able to identify non-linear feature interactions; scale linearly with the number of features and dimensions; allow the incorporation of known sparsity structure. In this work we propose a novel feature selection algorithm, Gradient Boosted Feature Selection (GBFS), which satisfies all four of these requirements. The algorithm is flexible, scalable, and surprisingly straightforward to implement as it is…
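The abstract describes selecting features through gradient boosting rather than as a separate preprocessing step. As a toy sketch of that general idea (this is NOT the paper's GBFS algorithm, which adds explicit per-feature penalties and sparsity handling), one can boost depth-1 regression stumps on the squared loss and take the set of features the ensemble actually splits on as the selected subset:

```python
# Toy illustration of boosting-driven feature selection (not the GBFS
# algorithm from the paper): fit gradient-boosted regression stumps and
# report which features the ensemble uses.

def fit_stump(X, r):
    """Find the (feature, threshold, left/right value) stump minimizing
    squared error against the residuals r."""
    best = None
    n = len(X)
    for j in range(len(X[0])):
        vals = sorted(set(row[j] for row in X))
        for t in vals[:-1]:  # every threshold leaves both sides non-empty
            left = [r[i] for i in range(n) if X[i][j] <= t]
            right = [r[i] for i in range(n) if X[i][j] > t]
            lm = sum(left) / len(left)
            rm = sum(right) / len(right)
            sse = (sum((v - lm) ** 2 for v in left)
                   + sum((v - rm) ** 2 for v in right))
            if best is None or sse < best[0]:
                best = (sse, j, t, lm, rm)
    return best[1], best[2], best[3], best[4]

def boosted_selection(X, y, rounds=20, lr=0.5):
    """Boost stumps on the squared loss; the features that receive splits
    form the selected subset."""
    pred = [0.0] * len(y)
    used = set()
    for _ in range(rounds):
        # Negative gradient of squared loss = current residuals.
        resid = [y[i] - pred[i] for i in range(len(y))]
        j, t, lm, rm = fit_stump(X, resid)
        used.add(j)
        for i in range(len(y)):
            pred[i] += lr * (lm if X[i][j] <= t else rm)
    return used, pred
```

On data where only the first of three features carries signal, the ensemble splits solely on that feature, so the "selected" set is `{0}`. The real GBFS additionally charges a cost the first time a feature (or feature group) is used, which is what yields controlled sparsity.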

Cited by 148 publications (64 citation statements)
References 31 publications
“…The basic idea is to reduce the dimensionality of a large dataset by selecting a subset of representative features without substantial loss of information [5,7,24,39]. This problem has attracted substantial attention in the so-called high-dimensional regime, where it is typically assumed that only a small subset of features are relevant to a response.…”
Section: Related Workmentioning
confidence: 99%
“…Recently, Xu et al [57] proposed a gradient boosted feature selection approach that is able to consider predefined group feature structures. However, the proposed approach employs the group structure information for controlled boosting only and, thus, components from the same multi-dimensional feature are, in general, preferred for selection.…”
Section: Related Workmentioning
confidence: 99%
“…Lasso is a widely used technique in different models, including logistic regression, support vector machines, and deep neural networks [92]. More recently, a feature selection method based on gradient boosting [148] was proposed. Another feature selection method is based on the dependency between features and class labels.…”
Section: Fisher Kernel Approach Via Generative Modelsmentioning
confidence: 99%
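The last snippet contrasts GBFS with Lasso-style selection, where an L1 penalty drives irrelevant coefficients exactly to zero. A minimal sketch of that baseline, using plain coordinate descent with soft-thresholding on a linear model (an illustration of the standard Lasso, not code from any of the cited papers):

```python
# Minimal Lasso via coordinate descent: features whose coefficients are
# driven exactly to zero by the L1 penalty are "deselected".

def soft_threshold(z, g):
    """Proximal operator of the L1 penalty: shrink z toward 0 by g."""
    if z > g:
        return z - g
    if z < -g:
        return z + g
    return 0.0

def lasso_cd(X, y, lam, iters=50):
    """Coordinate descent for min_w 0.5*||y - Xw||^2 + lam*||w||_1."""
    n, d = len(X), len(X[0])
    w = [0.0] * d
    for _ in range(iters):
        for j in range(d):
            # Residual with feature j's contribution removed.
            r = [y[i] - sum(w[k] * X[i][k] for k in range(d) if k != j)
                 for i in range(n)]
            rho = sum(X[i][j] * r[i] for i in range(n))
            z = sum(X[i][j] ** 2 for i in range(n))
            w[j] = soft_threshold(rho, lam) / z if z else 0.0
    return w
```

On a small problem with one informative feature and one uncorrelated one, the second coefficient lands at exactly zero while the first stays near its true value (shrunk slightly by the penalty). Unlike GBFS, this captures only linear effects, which is precisely the limitation the boosted approach targets.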