2019
DOI: 10.1609/aaai.v33i01.33013705

AFS: An Attention-Based Mechanism for Supervised Feature Selection

Abstract: As an effective data preprocessing step, feature selection has shown its effectiveness in preparing high-dimensional data for many machine learning tasks. The proliferation of high-dimensional, high-volume big data, however, has posed major challenges, e.g. computational complexity and stability on noisy data, to existing feature-selection techniques. This paper introduces a novel neural network-based feature selection architecture, dubbed Attention-based Feature Selection (AFS). AFS consists of two detachable…
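The truncated abstract describes a two-module design: a detachable attention module that produces per-feature weights, feeding a conventional learning module. Below is a minimal sketch of that idea, assuming a PyTorch-style implementation; the class names, layer sizes, and the sigmoid gating are illustrative assumptions, not the paper's actual code.

```python
import torch
import torch.nn as nn

class AttentionFeatureSelector(nn.Module):
    """Attention module: maps the raw input to one weight per feature."""
    def __init__(self, n_features, hidden=64):
        super().__init__()
        # Small sub-network that produces one attention score per feature.
        self.scorer = nn.Sequential(
            nn.Linear(n_features, hidden),
            nn.Tanh(),
            nn.Linear(hidden, n_features),
        )

    def forward(self, x):
        weights = torch.sigmoid(self.scorer(x))  # per-feature weights in (0, 1)
        return x * weights, weights

class AFSNet(nn.Module):
    """Attention module plus learning module (here a small MLP classifier)."""
    def __init__(self, n_features, n_classes):
        super().__init__()
        self.attention = AttentionFeatureSelector(n_features)
        self.learner = nn.Sequential(
            nn.Linear(n_features, 32),
            nn.ReLU(),
            nn.Linear(32, n_classes),
        )

    def forward(self, x):
        weighted, weights = self.attention(x)
        return self.learner(weighted), weights
```

Because the attention module is a separate sub-network, it can in principle be detached and reattached to a different learning module, which matches the "detachable" framing in the abstract.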

Cited by 56 publications (48 citation statements)
References 16 publications
“…Afterwards, they analysed the selected features for any relation; however, this required prior knowledge of the subject, which is not always readily available. A variation of this concept by Ning Gui et al. [7] employs a detachable NN between the input and the model: an attention network attached to a trained NN. This detachable attention network selects features from the input, using the correlations within the data to generate a probability for each feature.…”
Section: Related Work
confidence: 99%
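A hedged usage illustration of the per-feature probabilities this quote describes, continuing the hypothetical AFSNet sketch above: after training, the attention weights can be averaged over the data to rank features. `model`, `X`, and the cut-off of 10 are assumptions for illustration, not values from the cited work.

```python
# Rank features by their mean attention weight over a dataset
# (continues the AFSNet sketch above; names are illustrative).
model.eval()
with torch.no_grad():
    _, weights = model(X)            # X: (n_samples, n_features) tensor
ranking = weights.mean(dim=0).argsort(descending=True)
top_k = ranking[:10]                 # indices of the 10 highest-weighted features
```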
“…The basis of the method comes from Yifeng et al. [16] and Ning Gui et al. [7]: a sparse layer is introduced between the input and the first hidden layer of a neural network, with additional processing so that the model selects a subset of features during training. Our approach expands upon this in two areas: feature analysis and feature selection.…”
Section: Deep Feature and Sensor Selection
confidence: 99%
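The sparse-layer approach quoted above is only described at a high level; the following is a minimal sketch of what such a layer between the input and the first hidden layer might look like, assuming element-wise trainable weights driven to zero by an L1 penalty (the penalty is an assumption for illustration, not necessarily the regulariser used in [16] or [7]).

```python
import torch
import torch.nn as nn

class SparseInputLayer(nn.Module):
    """Sketch of a sparse layer between the input and the first hidden
    layer: one trainable weight per feature, pushed toward zero by an
    L1 penalty so that unselected features are effectively dropped."""
    def __init__(self, n_features):
        super().__init__()
        self.mask = nn.Parameter(torch.ones(n_features))

    def forward(self, x):
        return x * self.mask  # element-wise gating of each input feature

    def l1_penalty(self):
        return self.mask.abs().sum()

# During training, add `lam * sparse.l1_penalty()` to the task loss;
# features whose mask weight shrinks toward zero are deselected.
```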
“…Then, we utilise this information so that we can better exploit the existing information and better inform the model's decision making. In this paper, we therefore aim to show another side of these models, the one that focuses on the attention mechanism, and thus to extract the feature importance of the data (Ning Gui, 2019).…”
Section: Introduction
confidence: 99%