2015
DOI: 10.1016/j.patrec.2015.05.006
A new method for constructing granular neural networks based on rule extraction and extreme learning machine

Cited by 10 publications (5 citation statements)
References 30 publications
“…The inferred posterior class probabilities of bankruptcy were used to analyze the sensitivity of the classifier output and to assist in the credit assignment decision-making process. A hybrid model combining a neural network with an adaptive boosting method was studied by Xu et al. (2015). Sexton and McMurtrey (2006) and Setiono et al. (2009) studied a genetic algorithm-based neural network algorithm for credit card screening and found that neural network rule extraction is very effective at discovering knowledge and is particularly appropriate in applications that require both comprehensibility and accuracy.…”
Section: Financial Crisis and Risk Modeling
confidence: 99%
“…Granular Neural Networks (GNNs) were introduced in [1] as the amalgamation of Neural Networks and Granular Computing. Roughly speaking, this synergy aims at reconciling the black-box behavior and lack of transparency of neural networks with the aid of information granules, such as classes, clusters, and subsets [2]. By exploiting similarity among objects and granulation [3], GNNs avoid the need for large datasets and precise information while building lighter, more interpretable models.…”
Section: Introduction
confidence: 99%
“…Ahn et al. proposed a hybrid intelligent system combining rough sets with an artificial neural network to predict firm failure [31], along with a new reduction algorithm called 2D reduction. The 2D reduction method uses rough sets to eliminate irrelevant or redundant attributes, then scans the samples to delete those with inconsistent decisions, and finally extracts association rules from the reduced data. In Ahn's hybrid classification model, if a new instance matches some association rules, it is classified by those rules; otherwise, the algorithm trains a classifier on the 2D-reduced data to classify the instance. Xu et al. introduced a rough rule granular extreme learning machine called RRGELM [32]. RRGELM uses rough sets to extract association rules, and the number of hidden-layer neurons is determined by the number of those rules; the input-layer weights are not randomly generated but are instead determined by whether each instance is covered by the association rules. These works have advanced the integration of rough sets and neural networks.…”
Section: Introduction
confidence: 99%
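The RRGELM construction quoted above — one hidden neuron per extracted association rule, with input weights fixed by rule coverage rather than random initialization, and output weights solved in the usual ELM least-squares fashion — can be sketched roughly as follows. This is a minimal toy illustration under stated assumptions: the `rule_coverage` helper, the rule encoding, and the data are inventions for this sketch, not the paper's actual formulation.

```python
import numpy as np

# Hypothetical sketch of the rule-coverage idea: rules, data, and the
# rule_coverage helper are illustrative assumptions, not the cited
# paper's exact method.

def rule_coverage(X, rules):
    """Build the hidden-layer matrix H, where H[i, j] = 1 when sample i
    is covered by association rule j (all listed attributes match)."""
    H = np.zeros((len(X), len(rules)))
    for j, rule in enumerate(rules):
        H[:, j] = [float(all(x[a] == v for a, v in rule.items())) for x in X]
    return H

# Toy dataset: two condition attributes and a real-valued target.
X = np.array([[0, 1], [1, 1], [1, 0], [0, 0]])
y = np.array([[0.0], [1.0], [1.0], [0.0]])

# Two extracted rules -> two hidden neurons (one neuron per rule),
# each rule given as {attribute_index: required_value}.
rules = [{0: 1}, {0: 0, 1: 1}]

H = rule_coverage(X, rules)        # fixed, rule-determined input mapping
beta = np.linalg.pinv(H) @ y       # ELM output weights via least squares
y_hat = H @ beta                   # predictions on the training data
```

The key contrast with a standard ELM is that `H` is deterministic: the hidden representation encodes which rules fire for each instance, so only the output weights `beta` are learned.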
“…Traditional algorithms often treat rough sets and neural networks separately and do not fuse ELM with rough sets very well. Unlike existing algorithms, which use rough sets only to reduce attributes or to determine the number of hidden-layer neurons [31,32], RELM uses the partition of the data produced by rough sets to train different kinds of neurons, combining rough sets with the extreme learning machine more tightly.…”
Section: Introduction
confidence: 99%