2018
DOI: 10.1002/dac.3541

Estimator learning automata for feature subset selection in high‐dimensional spaces, case study: Email spam detection

Abstract: One of the difficult challenges facing data miners is that algorithm performance degrades if the feature space contains redundant or irrelevant features. Therefore, as a critical preprocessing task, dimension reduction is used to build a smaller space containing valuable features. There are two different approaches to dimension reduction: feature extraction and feature selection, which itself is divided into wrapper and filter approaches. In high-dimensional spaces, feature extraction…
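The filter approach mentioned in the abstract scores each feature independently of any classifier and keeps the top-scoring ones. As a minimal sketch (a generic mutual-information filter, not the paper's estimator-learning-automata method; all names here are illustrative):

```python
import math
from collections import Counter

def mutual_information(feature_col, labels):
    """Mutual information I(X; Y) between one discrete feature and the labels."""
    n = len(labels)
    joint = Counter(zip(feature_col, labels))
    px = Counter(feature_col)
    py = Counter(labels)
    mi = 0.0
    for (x, y), c in joint.items():
        p_xy = c / n
        mi += p_xy * math.log2(p_xy / ((px[x] / n) * (py[y] / n)))
    return mi

def filter_select(X, y, k):
    """Return the indices of the k features with the highest mutual information."""
    scores = [(mutual_information([row[j] for row in X], y), j)
              for j in range(len(X[0]))]
    scores.sort(reverse=True)
    return sorted(j for _, j in scores[:k])

# Toy data: feature 0 predicts the label perfectly, feature 1 is noise,
# feature 2 is constant (irrelevant).
X = [[1, 0, 1], [1, 1, 1], [0, 0, 1], [0, 1, 1]]
y = [1, 1, 0, 0]
selected = filter_select(X, y, 1)  # keeps only feature 0
```

A wrapper approach would instead train a classifier on candidate subsets and score each subset by validation accuracy: more expensive, but classifier-aware, which is why it becomes costly in high-dimensional spaces.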

Cited by 12 publications (4 citation statements)
References 57 publications (63 reference statements)
“…The theory of LA has found numerous applications in the field of computer science. One of the most recent applications of LA include sampling algorithms for stochastic graphs [19], trust propagation in online social networks [22], allocation hub location problem [4], selecting caching nodes in delay tolerant networks [9] and feature subset selection [23] to mention a few. For an updated overview over the theory and applications of LA we refer the reader to the following book [20] and to a recent special issue [21] dedicated to the applications of LA.…”
Section: Stochastic Nfek: State-of-the-art
confidence: 99%
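For readers unfamiliar with learning automata (LA): an LA maintains a probability vector over actions and reinforces whichever action the environment rewards. A minimal sketch of the classic linear reward-inaction (L_RI) scheme follows, with a toy deterministic environment; this is the basic LA update, not the estimator algorithms the cited paper builds on:

```python
import random

def l_ri_automaton(n_actions, reward_fn, rate=0.1, steps=2000, seed=0):
    """Linear reward-inaction (L_RI) scheme: on reward, move probability
    mass toward the chosen action; on penalty, change nothing."""
    rng = random.Random(seed)
    p = [1.0 / n_actions] * n_actions
    for _ in range(steps):
        a = rng.choices(range(n_actions), weights=p)[0]
        if reward_fn(a, rng):
            p = [pj + rate * (1 - pj) if j == a else pj * (1 - rate)
                 for j, pj in enumerate(p)]
    return p

# Toy environment: action 0 is always rewarded, action 1 never is,
# so the probability vector converges toward action 0.
probs = l_ri_automaton(2, lambda a, rng: a == 0)
```

Estimator variants (as in the cited paper) additionally keep running reward estimates per action and use them to speed up convergence; the update above is the plain non-estimator baseline.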
“…However, the model was theoretical and not tested on real data. Another interesting application was presented in [39]. The authors proposed estimator learning automaton-based subset selection as a new method for feature selection in high-dimensional spaces.…”
Section: Related Work
confidence: 99%
“…Dimensional reduction is achieved with steps such as feature selection and feature extraction. The two steps, feature selection and feature extraction, though both aiming at reducing the number of features, are different in that feature extraction generates new variables while feature selection removes noises without creating new features (Seyyedi & Minaeibidgoli, 2018). The most common methods for feature selection include Term Frequency-Inverse Document Frequency (TF-IDF), Chi-square Statistics, Information Gain, and Mutual Information (Sabbah et al, 2017; Shah & Patel, 2016).…”
Section: Introduction
confidence: 99%
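Of the methods listed in the quote above, TF-IDF is the simplest to sketch. The following uses raw-count term frequency and unsmoothed inverse document frequency, one common variant among several; the toy spam/ham documents are illustrative:

```python
import math
from collections import Counter

def tf_idf(docs):
    """TF-IDF weights per document: tf(t, d) * log(N / df(t)), where tf is the
    raw count of term t in document d and df is the number of documents
    containing t. Terms in every document get weight 0 (log(1) = 0)."""
    n = len(docs)
    df = Counter()
    for doc in docs:
        df.update(set(doc))  # count each term once per document
    return [{t: c * math.log(n / df[t]) for t, c in Counter(doc).items()}
            for doc in docs]

docs = [["free", "offer", "click"],
        ["meeting", "agenda", "click"],
        ["free", "free", "winner"]]
weights = tf_idf(docs)
```

In a filter pipeline for spam detection, such per-term weights (or the chi-square / information-gain scores from the same quote) are aggregated over the corpus and the top-ranked terms are kept as the feature subset.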
“…TC techniques are also important methods in knowledge management, such as content-based recommendation (Hawashin et al, 2019; Wijewickrema et al, 2019; Wu et al, 2020), patent classification (Kim et al, 2020), and information extraction (Al-Yahya, 2018). Besides, TC techniques have been frequently applied to the detection of unwanted messages, including short message spam, junk mails, and suspicious malignant mails (Ezpeleta et al, 2017; Hsiao & Chang, 2008; Mujtaba, Shuib, Raj, & Gunalan 2018; Seyyedi & Minaeibidgoli, 2018).…”
Section: Introduction
confidence: 99%