2021
DOI: 10.1007/s10994-021-05998-5
ReliefE: feature ranking in high-dimensional spaces via manifold embeddings

Abstract: Feature ranking has been widely adopted in machine learning applications such as high-throughput biology and social sciences. The approaches of the popular Relief family of algorithms assign importances to features by iteratively accounting for nearest relevant and irrelevant instances. Despite their high utility, these algorithms can be computationally expensive and not well suited for high-dimensional sparse input spaces. In contrast, recent embedding-based methods learn compact, low-dimensional representati…
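To make the Relief-style update concrete, the following is a minimal sketch of the classical Relief idea the abstract refers to (nearest hit and nearest miss per sampled instance), not the embedding-based ReliefE method itself; the function name, parameters, and toy data are illustrative only.

import numpy as np

def relief_scores(X, y, n_samples=100, seed=0):
    """Classical Relief-style scores for a binary-class, numeric dataset.

    For each sampled instance, every feature's weight is decreased by its
    difference to the nearest same-class neighbour (hit) and increased by
    its difference to the nearest other-class neighbour (miss), so features
    that separate the classes receive higher scores."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # Rescale features to [0, 1] so per-feature differences are comparable.
    span = X.max(axis=0) - X.min(axis=0)
    span[span == 0] = 1.0
    Xs = (X - X.min(axis=0)) / span

    m = min(n_samples, n)
    w = np.zeros(d)
    for i in rng.choice(n, size=m, replace=False):
        diff = np.abs(Xs - Xs[i])        # per-feature differences to all instances
        dist = diff.sum(axis=1)          # Manhattan distance to all instances
        dist[i] = np.inf                 # never pick the instance itself
        same = y == y[i]
        hit = np.argmin(np.where(same, dist, np.inf))
        miss = np.argmin(np.where(~same, dist, np.inf))
        w += diff[miss] - diff[hit]
    return w / m

# Toy usage: feature 0 separates the two classes, feature 1 is noise,
# so feature 0 should receive the higher Relief score.
X = np.array([[0.1, 5.0], [0.2, 4.8], [0.9, 5.1], [1.0, 4.9]])
y = np.array([0, 0, 1, 1])
print(relief_scores(X, y, n_samples=4))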

Cited by 6 publications (9 citation statements) · References 38 publications

Citation statements
“… RL, MSF, and SF: The Relief algorithm is a popular and effective feature selection method in machine learning. It was originally proposed by Kira and Rendell in 1992 [ 27 , 28 ]. It is among the most widely used non-myopic algorithms for feature ranking, where each feature is assigned a real-valued score, offering insights into its importance [ 28 ].…”
Section: Methods (mentioning)
confidence: 99%
“…It was originally proposed by Kira and Rendell in 1992 [ 27 , 28 ]. It is among the most widely used non-myopic algorithms for feature ranking, where each feature is assigned a real-valued score, offering insights into its importance [ 28 ]. Extensions and variations of the Relief algorithm have been proposed to address specific challenges or enhance its performance.…”
Section: Methods (mentioning)
confidence: 99%
“…The minimum value of the factor is zero when none of the reducts in the set contains the given variable. Therefore, unlike some other ranking mechanisms, such as Relief [ 31 ], with the proposed weighting factor some attributes can be considered irrelevant and assigned a rank of zero. The maximum value would be found when all reducts in the set are of the same length l and all include the attribute.…”
Section: Proposed Methodology (mentioning)
confidence: 99%
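The exact weighting factor is defined in the cited paper; purely for illustration, the sketch below shows one hypothetical formulation that is consistent with the quoted properties (zero when no reduct contains the attribute, largest when all reducts contain it and share the same length l). The function name and example reduct set are assumptions.

from typing import FrozenSet, Iterable

def reduct_weight(attribute: str, reducts: Iterable[FrozenSet[str]]) -> float:
    """Hypothetical reduct-based weighting factor: each reduct containing the
    attribute contributes 1/len(reduct), so the weight is zero when no reduct
    contains the attribute and is maximal when all reducts contain it and
    share the same (small) length l."""
    return sum(1.0 / len(r) for r in reducts if attribute in r)

# Hypothetical reduct set over attributes a, b, c, d.
reducts = [frozenset({"a", "b"}), frozenset({"a", "c"}), frozenset({"b", "c", "d"})]
print(reduct_weight("a", reducts))  # in two 2-element reducts -> 1.0
print(reduct_weight("d", reducts))  # in one 3-element reduct  -> 0.333...
print(reduct_weight("e", reducts))  # in no reduct -> 0.0, i.e. ranked irrelevant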
“…All variables are considered relevant, but with differing degrees of relevance. Relief and its variants are popular examples of this category of methods [ 31 ]. The second group of approaches can consider some features completely irrelevant and assign them a rank of zero.…”
Section: Preliminaries (mentioning)
confidence: 99%
“…In this section we explore the use of attention-based neural network mechanisms for estimating feature importance in domain adaptation. This section is inspired by the seminal works on the attention mechanism [3,34,29]. We took inspiration from [29] regarding feature importance but used a different implementation of the attention mechanism, defined as follows:…”
Section: Propositional Self-attention Feature Importance (mentioning)
confidence: 99%
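The cited paper's exact attention formulation is not reproduced here; the sketch below is a generic softmax-attention scoring layer of the kind described in the quote, with the class name, layer shape, and batch treated as assumptions rather than the authors' implementation.

import torch
import torch.nn as nn

class AttentionFeatureImportance(nn.Module):
    """Hypothetical attention-style scoring layer: a linear layer produces one
    score per input feature, a softmax turns the scores into attention
    weights, and those weights both re-scale the input and serve as
    per-feature importance estimates when averaged over data."""

    def __init__(self, n_features: int):
        super().__init__()
        self.score = nn.Linear(n_features, n_features)

    def forward(self, x: torch.Tensor):
        attn = torch.softmax(self.score(x), dim=-1)  # (batch, n_features) weights
        return x * attn, attn                        # re-weighted input and weights

# Average the attention weights over a batch to estimate feature importance.
layer = AttentionFeatureImportance(n_features=5)
x = torch.randn(32, 5)
_, attn = layer(x)
print(attn.mean(dim=0))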