2021
DOI: 10.1016/j.asoc.2021.107353

Attribute reduction methods in fuzzy rough set theory: An overview, comparative experiments, and new directions

Cited by 90 publications (9 citation statements)
References 122 publications

“…An et al [31] developed a robust fuzzy rough set based on kNN granules and then followed the approximation expression to design a fuzzy-rough-uncertainty-based semi-supervised feature selection method. More information can be found in [32], which recalled several important foundations of the fuzzy rough set model and then comprehensively investigated many advanced attribute reduction schemes.…”
Section: Related Work (mentioning)
confidence: 99%
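For reference, the "approximation expression" mentioned in the excerpt is, in the standard fuzzy rough set model, built from a fuzzy relation R on the universe U together with an implicator I and a t-norm T; the kNN-granule model of [31] is a robust variant of this construction, so the exact expression there may differ. The commonly used lower and upper approximations of a fuzzy set A are

\underline{R}A(x) = \inf_{y \in U} \mathcal{I}\big(R(x,y),\, A(y)\big), \qquad \overline{R}A(x) = \sup_{y \in U} \mathcal{T}\big(R(x,y),\, A(y)\big).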
“…The innovativeness of component-based complex products has not been analyzed [14,15]. Optimization of the component product structure has not been explicitly performed [16][17][18][19]. The investment attractiveness of component-based innovative product projects has practically not been considered [20][21][22].…”
Section: Analysis Of Recent Research and Publications (mentioning)
confidence: 99%
“…In the process of constructing the discernibility matrix, the core attributes are obtained by finding the minimum element in the discernibility matrix and identifying the reducts [25]. Heuristic algorithms [26]-[28] typically calculate the attribute dependency to determine the reducts. Examples include test-cost-sensitive attribute reduction [29] and attribute reduction based on the conditional information entropy [30].…”
Section: Introduction (mentioning)
confidence: 99%
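The dependency-driven heuristic reduction described in the excerpt above can be illustrated with a minimal greedy (QuickReduct-style) sketch. This is not the algorithm of any particular cited paper: it uses crisp equivalence classes and positive-region dependency for simplicity, whereas the fuzzy rough variants replace the positive region with fuzzy lower approximations. All function and variable names are illustrative.

# Minimal sketch of dependency-based greedy attribute reduction
# (QuickReduct-style). Crisp positive-region dependency is used
# for clarity; fuzzy rough versions swap in fuzzy lower approximations.

from itertools import groupby

def partition(rows, attrs):
    """Group object indices by their values on the given attributes."""
    key = lambda i: tuple(rows[i][a] for a in attrs)
    idx = sorted(range(len(rows)), key=key)
    return [set(g) for _, g in groupby(idx, key=key)]

def dependency(rows, labels, attrs):
    """gamma(attrs) = |POS(attrs)| / |U|: fraction of objects whose
    equivalence class is consistent with the decision labels."""
    if not attrs:
        return 0.0
    pos = 0
    for block in partition(rows, attrs):
        if len({labels[i] for i in block}) == 1:
            pos += len(block)
    return pos / len(rows)

def quick_reduct(rows, labels, all_attrs):
    """Greedily add the attribute that most increases dependency
    until the dependency of the full attribute set is reached."""
    full = dependency(rows, labels, all_attrs)
    reduct, gamma = [], 0.0
    while gamma < full:
        best_a, best_gamma = None, gamma
        for a in all_attrs:
            if a in reduct:
                continue
            g = dependency(rows, labels, reduct + [a])
            if g > best_gamma:
                best_a, best_gamma = a, g
        if best_a is None:   # no remaining attribute improves dependency
            break
        reduct.append(best_a)
        gamma = best_gamma
    return reduct

# Toy usage: 4 objects, 3 conditional attributes, binary decision.
rows = [{"a": 0, "b": 1, "c": 0},
        {"a": 0, "b": 1, "c": 1},
        {"a": 1, "b": 0, "c": 0},
        {"a": 1, "b": 1, "c": 1}]
labels = [0, 0, 1, 1]
print(quick_reduct(rows, labels, ["a", "b", "c"]))   # -> ['a']

Test-cost-sensitive or entropy-based variants mentioned in the excerpt follow the same greedy loop but replace the dependency measure with a cost-weighted or conditional-information-entropy criterion.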