Constrained dynamic rule induction learning (2016)
DOI: 10.1016/j.eswa.2016.06.041


Cited by 20 publications (5 citation statements)
References 16 publications
“…Learning software includes pattern recognition and machine learning (ML), which embody some aspects of the human mind, allowing us to tackle extremely complex problems at speeds surpassing even the fastest computers [48]. Wen et al. [49] reported the most commonly used techniques: case-based reasoning (CBR) [50], artificial neural networks (ANN), decision trees (DT) [51], Bayesian networks (BN) [52], support vector regression (SVR) [53], genetic algorithms (GA) [54], genetic programming (GP) [55,56], association rules (AR) [57], rule induction (RI) [58], and fuzzy algorithms [59].…”
Section: Information Systems and Learning Software (mentioning)
confidence: 99%
“…The learning method pseudocode is shown in Algorithm 1. The algorithm utilizes two thresholds, named the Minimum Frequency (Min_Freq) and the Rule Strength (R_S), as in other covering approaches such as Dynamic Rule Induction (DRI) [30,48], to find and extract the rules (Definitions 2 and 3, respectively). The Min_Freq threshold is used as a cutoff point for variables and class values in the training data (items).…”
Section: The Learning Covering Algorithm (mentioning)
confidence: 99%
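The excerpt above describes the two thresholds without showing how they act on the data. The following is a minimal, illustrative sketch of that thresholding step for single-condition items, not the paper's actual Algorithm 1: the function name, data layout, and the use of confidence as the "rule strength" measure are all assumptions made here for illustration.

```python
from collections import Counter

# Hypothetical sketch of the two-threshold pruning described above:
# items (attribute-value pairs) below Min_Freq are discarded, and a
# candidate one-condition rule survives only if its strength
# (here taken to be confidence) reaches R_S.

def extract_rules(rows, class_attr, min_freq, rule_strength):
    item_class = Counter()   # (attr, value, class) occurrence counts
    item_total = Counter()   # (attr, value) occurrence counts
    for row in rows:
        cls = row[class_attr]
        for attr, value in row.items():
            if attr == class_attr:
                continue
            item_total[(attr, value)] += 1
            item_class[(attr, value, cls)] += 1

    rules = []
    for (attr, value, cls), freq in item_class.items():
        if freq < min_freq:                    # Min_Freq cutoff on items
            continue
        strength = freq / item_total[(attr, value)]
        if strength >= rule_strength:          # R_S cutoff on candidate rules
            rules.append(((attr, value), cls, strength))
    return rules

rows = [
    {"outlook": "sunny", "windy": "no",  "play": "yes"},
    {"outlook": "sunny", "windy": "yes", "play": "yes"},
    {"outlook": "rain",  "windy": "yes", "play": "no"},
    {"outlook": "rain",  "windy": "no",  "play": "yes"},
]
print(extract_rules(rows, "play", min_freq=2, rule_strength=1.0))
# → [(('outlook', 'sunny'), 'yes', 1.0), (('windy', 'no'), 'yes', 1.0)]
```

A full covering algorithm would then remove the instances matched by each accepted rule and repeat until the training data is exhausted; the sketch shows only one pass of the threshold tests.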
“…This methodology suffers a major drawback: considerable time is required to tune the parameters, and a domain expert may be needed to decipher the dataset. Instead of trial and error, an improved self-structuring NN anti-phishing model was proposed by Thabtah et al. [50]. Their algorithm dynamically updates several parameters, such as the learning rate, prior to adding a new neuron to the hidden layer.…”
Section: Associative Classification (AC) (mentioning)
confidence: 99%
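The control logic described in that excerpt, adapting the learning rate first and growing the hidden layer only when that no longer helps, can be sketched in isolation. This is not Thabtah et al.'s actual procedure: the function, the patience/decay parameters, and the use of a precomputed error sequence (rather than real training) are assumptions made purely to illustrate the order of the two interventions.

```python
# Illustrative sketch: when training error stalls, first decay the
# learning rate; only once it has bottomed out is a hidden neuron added
# (and the rate reset for the grown network).

def self_structuring_schedule(errors_per_epoch, lr=0.1, decay=0.5,
                              min_lr=1e-3, patience=2):
    hidden_neurons = 1
    best = float("inf")
    stall = 0
    for err in errors_per_epoch:
        if err < best:
            best, stall = err, 0      # still improving: do nothing
            continue
        stall += 1
        if stall >= patience:
            if lr > min_lr:
                lr *= decay           # adjust the learning rate first ...
            else:
                hidden_neurons += 1   # ... add a neuron only as a last resort
                lr = 0.1
            stall = 0
    return hidden_neurons, lr

# Two stall windows trigger two rate decays, but no new neuron yet:
print(self_structuring_schedule([1.0, 0.9, 0.9, 0.9, 0.9, 0.9]))
# → (1, 0.025)
```

The point of the ordering is the one made in the excerpt: cheap parameter adaptation is exhausted before the more disruptive structural change to the network.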