2016 International Conference on Engineering & MIS (ICEMIS)
DOI: 10.1109/icemis.2016.7745366
Review on wrapper feature selection approaches

Cited by 126 publications (73 citation statements)
References 13 publications
“…The dataset format supported by LIBSVM is illustrated in Figure 4. Furthermore, ILFS and Relief are implemented using the feature selection library FSLib [10], [21].…”
Section: B. Methods
confidence: 99%
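The excerpt above points to the sparse text format that LIBSVM reads (the Figure 4 it mentions is not reproduced here). For reference only, a minimal Python sketch of that format, one line per sample consisting of the label followed by 1-based index:value pairs with zeros omitted, could look like the following; the helper name to_libsvm_line and the toy data are illustrative, not taken from the citing paper.

```python
# Minimal sketch of the LIBSVM sparse text format: "<label> <index>:<value> ...".
# Feature indices are 1-based and zero-valued features are left out.
def to_libsvm_line(label, features):
    """Format one sample as a LIBSVM-style text line."""
    pairs = [f"{i + 1}:{v}" for i, v in enumerate(features) if v != 0]
    return " ".join([str(label)] + pairs)

if __name__ == "__main__":
    X = [[0.0, 2.5, 0.0, 1.0],
         [3.0, 0.0, 0.0, 0.5]]
    y = [1, -1]
    for label, row in zip(y, X):
        print(to_libsvm_line(label, row))
    # prints:
    # 1 2:2.5 4:1.0
    # -1 1:3.0 4:0.5
```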
“…The weights can be calculated from the probability of co-occurrences between features and tokens using the PLSA technique [16]. Finally, the weights are optimized using the Expectation-Maximization (EM) algorithm. Further details about ILFS can be found in [10]. The detailed process of the ILFS technique is described as follows:…”
Section: B. Infinite Latent Feature Selection (ILFS)
confidence: 99%
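To make the PLSA-plus-EM step in the excerpt above more concrete, the sketch below runs a generic PLSA-style EM over a feature-token co-occurrence count matrix, alternating topic responsibilities (E-step) with re-estimation of P(z), P(feature|z) and P(token|z) (M-step). This is a simplified illustration under assumptions made here, not the ILFS implementation from [10] or FSLib; the function plsa_em and its parameters are hypothetical.

```python
# Illustrative PLSA-style EM over a feature-token co-occurrence count matrix N
# (shape: n_features x n_tokens). Not the ILFS code from [10] or FSLib.
import numpy as np

def plsa_em(N, n_topics=2, n_iter=50, seed=0):
    N = np.asarray(N, dtype=float)
    rng = np.random.default_rng(seed)
    n_feat, n_tok = N.shape
    Pz = np.full(n_topics, 1.0 / n_topics)                     # P(z)
    Pf_z = rng.random((n_topics, n_feat))                      # P(feature | z)
    Pf_z /= Pf_z.sum(axis=1, keepdims=True)
    Pt_z = rng.random((n_topics, n_tok))                       # P(token | z)
    Pt_z /= Pt_z.sum(axis=1, keepdims=True)
    for _ in range(n_iter):
        # E-step: responsibility of each topic z for each (feature, token) pair.
        joint = Pz[:, None, None] * Pf_z[:, :, None] * Pt_z[:, None, :]
        resp = joint / (joint.sum(axis=0, keepdims=True) + 1e-12)
        # M-step: re-estimate the distributions from expected co-occurrence counts.
        counts = resp * N[None, :, :]
        Pz = counts.sum(axis=(1, 2))
        Pz /= Pz.sum()
        Pf_z = counts.sum(axis=2)
        Pf_z /= Pf_z.sum(axis=1, keepdims=True) + 1e-12
        Pt_z = counts.sum(axis=1)
        Pt_z /= Pt_z.sum(axis=1, keepdims=True) + 1e-12
    return Pz, Pf_z, Pt_z
```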
“…On the contrary, if l approaches t the system may miss an occupied room. We have empirically seen that for l = (1/3)·t (line 3), we get the best accuracy. Given the feature set x_t of the occupied rooms, the algorithm calculates the new emission probability (line 4).…”
[Fragment of Algorithm 1, Location-Aware HMM: Z ← Z ∪ Viterbi(HMM)]
Section: Occupancy Estimation Algorithm
confidence: 99%
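The excerpt above decodes occupancy with a call of the form Viterbi(HMM). For orientation, a generic log-space Viterbi decoder might look like the sketch below; it is not the citing paper's Algorithm 1 (Location-Aware HMM), and the argument names (obs, start_p, trans_p, emit_p) are assumptions made here for illustration.

```python
# Generic log-space Viterbi decoder for a discrete-observation HMM.
import numpy as np

def viterbi(obs, start_p, trans_p, emit_p):
    """Most likely state path; trans_p[i, j] = P(j | i), emit_p[s, o] = P(o | s)."""
    start_p, trans_p, emit_p = map(np.asarray, (start_p, trans_p, emit_p))
    n_states, T = len(start_p), len(obs)
    logd = np.full((T, n_states), -np.inf)        # best log-prob ending in state s at time t
    back = np.zeros((T, n_states), dtype=int)     # backpointers for path recovery
    logd[0] = np.log(start_p) + np.log(emit_p[:, obs[0]])
    for t in range(1, T):
        for s in range(n_states):
            scores = logd[t - 1] + np.log(trans_p[:, s])
            back[t, s] = int(np.argmax(scores))
            logd[t, s] = scores[back[t, s]] + np.log(emit_p[s, obs[t]])
    # Backtrack from the best final state to recover the full state sequence.
    path = [int(np.argmax(logd[-1]))]
    for t in range(T - 1, 0, -1):
        path.append(back[t, path[-1]])
    return path[::-1]
```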
“…Feature selection techniques are important for selecting features that yield good classification results. Previous works on feature selection were based on filter or wrapper methods [26][27][28]. However, these methods need to process a large amount of data to find appropriate features, which is hard to achieve on a real-time, low-cost platform.…”
Section: Feature Selection and Extraction
confidence: 99%
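Since the reviewed paper surveys wrapper approaches, a small wrapper-style example may help fix ideas: the sketch below performs greedy sequential forward selection, scoring each candidate feature subset with a cross-validated classifier (scikit-learn's KNeighborsClassifier, chosen here only for illustration). The function forward_select and its defaults are hypothetical and are not drawn from the review or the citing papers.

```python
# Illustrative wrapper-style sequential forward selection: greedily add the
# feature whose inclusion most improves cross-validated accuracy.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

def forward_select(X, y, max_features=5, cv=5):
    X = np.asarray(X)
    selected, remaining = [], list(range(X.shape[1]))
    best_score = -np.inf
    while remaining and len(selected) < max_features:
        # Re-train and score the wrapped classifier for every candidate subset.
        scores = {
            f: cross_val_score(KNeighborsClassifier(),
                               X[:, selected + [f]], y, cv=cv).mean()
            for f in remaining
        }
        f_best = max(scores, key=scores.get)
        if scores[f_best] <= best_score:
            break  # no remaining feature improves the score; stop early
        best_score = scores[f_best]
        selected.append(f_best)
        remaining.remove(f_best)
    return selected, best_score
```

The repeated re-training over candidate subsets is exactly the data-processing burden the excerpt highlights as hard to afford on a real-time, low-cost platform, which is why filter methods are often preferred in that setting.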