2002
DOI: 10.1016/s0957-4174(02)00016-7

Learning rules from incomplete training examples by rough sets

Cited by 93 publications (22 citation statements) | References 14 publications

“…No additional information or statistical assumption is needed (Goh & Law, 2003; Su & Hsu, 2006). The RST has been successfully applied in a variety of fields such as: business failure prediction (Ahn, Cho, & Kim, 2000; Beynon & Peel, 2001; Dimitras, Slowinski, Susmaga, & Zopounidis, 1999; Slowinski & Zopounidis, 1995), rough neural expert system (Yahia, Mahmod, Sulaiman, & Ahmad, 2000), maximally general fuzzy rules (Hong, Wang, Wang, & Chien, 2000), customer and product fragmentation (Changchien & Lu, 2001), rules from incomplete training examples (Hong, Tseng, & Wang, 2002), stock price mining (Wang, 2003), hierarchical decision rules from clinical databases (Tsumoto, 2003), case-based reasoning application (Huang & Tseng, 2004), travel pattern generation (Witlox & Tindemans, 2004), credit scoring (Ong, Huang, & Tzeng, 2005), bank credit ratings (Griffiths & Beynon, 2005), rule discovery from noisy data (Wang, 2005), group decision (Huang, Ong, & Tzeng, 2006), classification rules (Tsai, Cheng, & Chang, 2006), customer relationship management (Tseng & Huang, 2007), insurance market (Shyng, Wang, Tzeng, & Wu, 2007), drug utilization knowledge (Chou, Cheng, & Chang, 2007), supplier selection (Xia & Wu, 2007), location-based services (Sikder & Gangopadhyay, 2007), neighborhood classifiers (Hu, Yu, & Xie, 2008), cross-level certain and possible rules (Hong, Lin, Lin, & Wang, 2008), feature selection (Chen, Tseng, & Hong, 2008), and so on. The basics of RST are explained below.…”
Section: Methods (mentioning)
confidence: 99%
“…For example, Morad, Svrcek, and McKay (2000) proposed an unsupervised method of learning probability density function parameters in the framework of mixture densities from incomplete data to find the maximum likelihood estimate of the missing values. Hong, Tseng, and Wang (2002) proposed a learning method based on rough sets. Their method first assumes a missing value to be any possible value, and then gradually refines the value according to the incomplete lower and upper approximations derived from the given training examples.…”
Section: Literature Review (mentioning)
confidence: 99%
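The general mechanism sketched in this excerpt (treat a missing value as compatible with any value, then refine using lower and upper approximations) can be illustrated with a small, hedged example. The Python fragment below is a minimal sketch of tolerance-based rough approximations over an incomplete decision table; the toy table, attribute values, and the '*' marker for missing entries are assumptions for illustration, not the notation or implementation of Hong, Tseng, and Wang (2002).

```python
# Minimal, illustrative sketch (not the cited algorithm): tolerance-based
# lower/upper approximations for an incomplete decision table, where a
# missing value ('*') is treated as compatible with any value.

MISSING = "*"

# Each object: (condition attribute values, decision class).  Toy data only.
table = [
    (("high", "yes"), "flu"),
    (("high", MISSING), "flu"),
    ((MISSING, "no"), "healthy"),
    (("normal", "no"), "healthy"),
]

def tolerant(x, y):
    """Two objects are tolerant if all their known condition values agree."""
    return all(a == b or MISSING in (a, b) for a, b in zip(x, y))

def tolerance_class(i):
    """Indices of objects indistinguishable from object i under tolerance."""
    return {j for j, (cond, _) in enumerate(table) if tolerant(table[i][0], cond)}

def approximations(decision):
    """Lower/upper approximation of the set of objects with a given decision."""
    target = {i for i, (_, d) in enumerate(table) if d == decision}
    lower = {i for i in range(len(table)) if tolerance_class(i) <= target}
    upper = {i for i in range(len(table)) if tolerance_class(i) & target}
    return lower, upper

for d in ("flu", "healthy"):
    low, up = approximations(d)
    # Lower-approximation objects support certain conclusions for d;
    # objects only in the upper approximation support possible ones.
    print(d, "lower:", sorted(low), "upper:", sorted(up))
```

Objects whose tolerance class lies entirely inside a decision class support certain conclusions; objects whose tolerance class merely overlaps it support only possible ones, which is where the gradual refinement of missing values comes in.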
“…Then, the problem is converted into extracting and simplifying rules in a complete information system. Here, we refer to the method in [4]. For the above example, the rules can be extracted as follows …”
Section: (mentioning)
confidence: 99%
“…Kryszkiewicz [2] used indiscernibility relations to characterize incomplete data. In the sequel, some researchers made modifications to indiscernibility relations: for example, Stefanoski [9] proposed similarity relations, Wang [3] proposed constrained indiscernibility relations, and Tzung [4] proposed an algorithm which can simultaneously derive rules from incomplete data sets and estimate the missing values.…”
Section: Introduction (mentioning)
confidence: 99%
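The rule-derivation step mentioned in the last two excerpts can likewise be sketched: an object whose tolerance class is decision-consistent yields a certain rule, otherwise only a possible rule. The attribute names and toy table below are illustrative assumptions, not the algorithm of [4].

```python
# Minimal, illustrative sketch (not the cited algorithm): deriving certain
# and possible rules from an incomplete table via a tolerance relation.

MISSING = "*"

ATTRS = ("temperature", "cough")   # hypothetical attribute names
table = [
    (("high", "yes"), "flu"),
    (("high", MISSING), "flu"),
    ((MISSING, "no"), "healthy"),
]

def tolerant(x, y):
    """Objects are tolerant if all their known condition values agree."""
    return all(a == b or MISSING in (a, b) for a, b in zip(x, y))

def derive_rules():
    """One rule per object: certain if its tolerance class is decision-consistent."""
    rules = []
    for cond, dec in table:
        decisions = {d for c, d in table if tolerant(cond, c)}
        kind = "certain" if decisions == {dec} else "possible"
        body = " AND ".join(f"{a}={v}" for a, v in zip(ATTRS, cond) if v != MISSING)
        rules.append(f"{kind}: IF {body} THEN class={dec}")
    return rules

for rule in derive_rules():
    print(rule)
```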