2022
DOI: 10.1016/j.asoc.2022.109769

Exploiting fuzzy rough mutual information for feature selection

Cited by 15 publications (3 citation statements)
References 35 publications
“…The entire reduction process is accomplished over the complete data by using both fuzzy and IF aided techniques. FSFrMI 72 , GIFRFS 57 , TIFRFS 59 , and FRFS 6 are earlier efficacious and effective techniques, which are incorporated in the comparative results (Table 2). Our proposed method produced reduct sets ranging from 7 to 169 features, and the reduct size is smaller than that of the earlier approaches, except on the bank marketing and thyroid-hypothyroid datasets.…”
Section: Results
confidence: 99%
“…Further, it can be stated that mutual information (MI) 71 is an interesting quantity that evaluates the dependence between conditional features and has been repeatedly employed to solve a wide range of diverse problems. Feature selection techniques can be made more effective by incorporating the information entropy estimation notion for attribute extraction based on MI 72 , alongside conventional feature selection approaches based on class separability. Broadly, MI measures the amount of information that can be deduced about one random variable/vector from another 73 , 74 .…”
Section: Introduction
confidence: 99%
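As a minimal illustration of the classical MI-based ranking referenced in the statement above (not the cited paper's fuzzy rough extension), the Python sketch below estimates I(X;Y) = sum over (x,y) of p(x,y) log( p(x,y) / (p(x) p(y)) ) for each discrete feature against the class labels and ranks features by that score. The function names and the toy data are assumptions made purely for this example.

```python
# A sketch of classical mutual-information-based feature ranking for
# discrete features; the fuzzy rough variant in the cited paper replaces
# the crisp probabilities below with fuzzy rough approximations.
import numpy as np

def mutual_information(x, y):
    """MI (in nats) between two discrete 1-D arrays."""
    x_vals, x_idx = np.unique(x, return_inverse=True)
    y_vals, y_idx = np.unique(y, return_inverse=True)
    joint = np.zeros((len(x_vals), len(y_vals)))
    for i, j in zip(x_idx, y_idx):
        joint[i, j] += 1
    joint /= joint.sum()                    # p(x, y)
    px = joint.sum(axis=1, keepdims=True)   # p(x)
    py = joint.sum(axis=0, keepdims=True)   # p(y)
    nz = joint > 0                          # skip zero cells to avoid log(0)
    return float(np.sum(joint[nz] * np.log(joint[nz] / (px @ py)[nz])))

def rank_features(X, y):
    """Rank feature columns of X by MI with the class vector y, descending."""
    scores = [mutual_information(X[:, j], y) for j in range(X.shape[1])]
    return sorted(enumerate(scores), key=lambda t: -t[1])

if __name__ == "__main__":
    # Toy discrete data: feature 0 tracks the class, feature 1 is noisier.
    X = np.array([[0, 1], [0, 0], [1, 1], [1, 0], [0, 1], [1, 0]])
    y = np.array([0, 0, 1, 1, 0, 1])
    print(rank_features(X, y))  # feature 0 should score highest
```

A real pipeline would discretize continuous attributes first, or replace this crisp MI with a fuzzy rough dependency measure as the cited work does, before selecting the top-ranked features.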