2018
DOI: 10.3390/sym10120696

Lower Approximation Reduction Based on Discernibility Information Tree in Inconsistent Ordered Decision Information Systems

Abstract: Attribute reduction is an important topic in rough set theory and has been widely applied. Reduction based on a discernibility matrix is a common method, but much of the space it uses is occupied by repeated and redundant discernibility attribute sets. Therefore, a new attribute-reduction method is proposed, which compresses and stores the discernibility attribute sets in a discernibility information tree. In this paper, the discernibility information tree based on a lower approximation i…
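The abstract's core idea — storing many discernibility attribute sets compactly by letting sets that share attributes share tree nodes — can be sketched with a simple prefix tree. This is an illustrative sketch of the general compression idea, not the paper's exact construction; the attribute names, the fixed ordering, and the `count` bookkeeping are assumptions for the example.

```python
class TrieNode:
    """Node in a prefix tree over attribute names (one child per attribute)."""
    def __init__(self):
        self.children = {}
        self.count = 0  # how many stored attribute sets pass through this node

def insert_attr_set(root, attr_set, order):
    """Insert one discernibility attribute set. Sorting by a fixed global
    order makes sets with common attributes share a path prefix."""
    node = root
    for a in sorted(attr_set, key=order.index):
        node = node.children.setdefault(a, TrieNode())
        node.count += 1
    return root

def node_count(node):
    """Total number of (non-root) nodes actually stored in the tree."""
    return sum(1 + node_count(c) for c in node.children.values())

# Hypothetical attribute ordering and attribute sets, for illustration only.
ORDER = ["a1", "a2", "a3", "a4"]
root = TrieNode()
for s in [{"a1", "a2"}, {"a1", "a2"}, {"a1", "a3"}]:
    insert_attr_set(root, s, ORDER)

# The two copies of {a1, a2} share one path, and {a1, a3} reuses the a1 node:
# 3 nodes total instead of 6 attribute entries in a raw matrix listing.
print(node_count(root))  # 3
```

The space saving grows with the amount of repetition among matrix entries, which is exactly the redundancy the abstract identifies in plain discernibility-matrix storage.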

Cited by 4 publications (3 citation statements) · References 34 publications
“…The indiscernibility relation of the set of all condition attributes is called the equivalent class, which is then used to obtain a discernibility matrix [17], [18].…”
Section: Rough Set Algorithm (mentioning)
confidence: 99%
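The statement above describes the standard rough-set pipeline: partition objects by the indiscernibility relation into equivalence classes, then build a discernibility matrix whose entries record which attributes separate objects with different decisions. A minimal sketch of both steps, using a toy decision table and attribute names (`a1`–`a3`, decision `d`) that are illustrative rather than taken from the paper:

```python
from itertools import combinations

# Toy decision table: each object maps condition attributes to values,
# plus a decision attribute "d". Purely illustrative data.
U = [
    {"a1": 1, "a2": 0, "a3": 1, "d": "yes"},
    {"a1": 1, "a2": 0, "a3": 1, "d": "yes"},
    {"a1": 0, "a2": 1, "a3": 1, "d": "no"},
    {"a1": 0, "a2": 0, "a3": 0, "d": "no"},
]
COND = ["a1", "a2", "a3"]

def equivalence_classes(universe, attrs):
    """Partition object indices by identical values on `attrs`
    (the indiscernibility relation)."""
    classes = {}
    for i, x in enumerate(universe):
        key = tuple(x[a] for a in attrs)
        classes.setdefault(key, []).append(i)
    return list(classes.values())

def discernibility_matrix(universe, attrs, decision="d"):
    """For each pair of objects with different decisions, record the
    set of condition attributes on which their values differ."""
    matrix = {}
    for i, j in combinations(range(len(universe)), 2):
        if universe[i][decision] != universe[j][decision]:
            matrix[(i, j)] = {a for a in attrs if universe[i][a] != universe[j][a]}
    return matrix

print(equivalence_classes(U, COND))    # objects 0 and 1 are indiscernible
print(discernibility_matrix(U, COND))  # e.g. pair (0, 2) is separated by {a1, a2}
```

Note the repetition in the matrix (pairs (0, 2)/(1, 2) and (0, 3)/(1, 3) yield identical attribute sets) — this is the redundancy the paper's tree structure is designed to compress away.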
“…At the same time, many formulas or methods were proposed to calculate the different types of attribute significances. Some classical formulas are designed based on the positive region [28][29][30], entropy [3,[16][17][18], the discernibility ability of attributes [13,14,24,31,32], the relationship between attributes [33], etc. In addition, many researchers proposed the mixed formulas by combining rough set theory and other theories, such as fuzzy set [12], ant colony optimization [23], granular computing [2,6,16,34], etc.…”
Section: Introduction (mentioning)
confidence: 99%
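Among the attribute-significance measures the statement above lists, the positive-region approach is the most direct to illustrate: the positive region POS_B(D) is the union of B-equivalence classes that fall wholly inside one decision class, and the dependency degree is gamma = |POS_B(D)| / |U|. A minimal sketch with an assumed toy table (attribute names and values are illustrative, not from the cited works):

```python
def equivalence_classes(universe, attrs):
    """Partition object indices by identical values on `attrs`."""
    classes = {}
    for i, x in enumerate(universe):
        classes.setdefault(tuple(x[a] for a in attrs), []).append(i)
    return list(classes.values())

def positive_region(universe, attrs, decision="d"):
    """Indices of objects whose equivalence class is consistent, i.e.
    every member of the class has the same decision value."""
    pos = []
    for cls in equivalence_classes(universe, attrs):
        decisions = {universe[i][decision] for i in cls}
        if len(decisions) == 1:
            pos.extend(cls)
    return pos

# Toy inconsistent table: objects 0 and 1 agree on all condition
# attributes but disagree on the decision, so neither is in POS.
U = [
    {"a1": 1, "a2": 0, "d": "yes"},
    {"a1": 1, "a2": 0, "d": "no"},
    {"a1": 0, "a2": 1, "d": "no"},
]
gamma = len(positive_region(U, ["a1", "a2"])) / len(U)
print(gamma)  # 1/3: only object 2 lies in the positive region
```

Entropy-based and discernibility-based significance formulas follow the same pattern — score an attribute subset, then rank attributes by how much removing or adding them changes the score.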
“…Considering the ensemble learning problem, Wang et al [17] introduced the forest optimization algorithm into the process of picking up reduct which can return multiple reducts, and used these reducts to develop an ensemble framework for executing voting classification over testing samples. Considering the monotonic classification problem, Zhang et al [18] applied the matrix approach for lower approximation in an inconsistent decision system to give the discriminative concept tree with the relations by dominance, and then fused the evaluation functions by tree approach to establish an efficient algorithm of searching lower approximation reduct. Considering the semi-supervised learning problem, Liu et al [19] proposed a semi-supervised attribute reduction approach that can handle the partially labeled data with label propagation algorithm and ensemble selector.…”
(mentioning)
confidence: 99%