2019
DOI: 10.1080/23311916.2019.1599537
An image feature selection approach for dimensionality reduction based on kNN and SVM for AkT proteins

Cited by 44 publications (16 citation statements)
References 18 publications
“…Classification is categorized as supervised and unsupervised classification. The SVM classifier falls under the class of supervised machine learning and operates on the basis of mathematical concepts and statistical theory [25,26]. The SVM classifier can classify both linear and non-linear data.…”
Section: Materials and Methodology (mentioning)
confidence: 99%
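As a rough illustration of the point made in this excerpt, the sketch below shows an SVM handling both linearly and non-linearly distributed data by switching kernels. It assumes scikit-learn and uses synthetic datasets (make_blobs, make_circles) that are not from the cited work.

```python
# Minimal sketch (not from the cited paper): an SVM can separate both linearly
# and non-linearly distributed data by changing its kernel.
# Assumes scikit-learn; datasets and hyperparameters are illustrative only.
from sklearn.datasets import make_blobs, make_circles
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Linearly separable data: a linear kernel is sufficient.
X_lin, y_lin = make_blobs(n_samples=200, centers=2, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X_lin, y_lin, random_state=0)
linear_svm = SVC(kernel="linear").fit(Xtr, ytr)
print("linear kernel on linear data:", linear_svm.score(Xte, yte))

# Non-linearly separable data (concentric circles): an RBF kernel maps the
# samples into a space where a separating hyperplane exists.
X_nl, y_nl = make_circles(n_samples=200, factor=0.3, noise=0.05, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X_nl, y_nl, random_state=0)
rbf_svm = SVC(kernel="rbf", gamma="scale").fit(Xtr, ytr)
print("RBF kernel on non-linear data:", rbf_svm.score(Xte, yte))
```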
“…The SVM is implemented with a grid search using a sigmoid kernel (tuning the gamma parameter) for training, while the k-NN is implemented with the number of neighbours set to five and a uniform weighting function for training [38]. There are several neighbour-search techniques for k-NN, such as the KD-tree [39], the fast library for approximate nearest neighbours (FLANN) [40], the cover tree [41], and the semi-convex hull tree [42]. However, this study implements k-NN using brute-force search because of its competitive performance for neighbour search on small data samples.…”
Section: Experimental Validation (mentioning)
confidence: 99%
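A minimal sketch of the classifier settings described in this excerpt, assuming scikit-learn: an SVM tuned by grid search over a sigmoid kernel's gamma, and a k-NN with five neighbours, uniform weights, and brute-force neighbour search. The parameter grid and the synthetic data are placeholders, not the cited study's actual configuration.

```python
# Hedged sketch of the described setup, assuming scikit-learn.
# X/y and the parameter grid are placeholders, not the study's real data/grid.
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, n_features=20, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

# SVM trained with a grid search over the gamma (and C) of a sigmoid kernel.
svm_grid = GridSearchCV(
    SVC(kernel="sigmoid"),
    param_grid={"gamma": [1e-3, 1e-2, 1e-1, 1.0], "C": [0.1, 1.0, 10.0]},
    cv=5,
)
svm_grid.fit(Xtr, ytr)
print("SVM (sigmoid, grid search) accuracy:", svm_grid.score(Xte, yte))

# k-NN with five neighbours, uniform weights, and an exhaustive (brute-force)
# neighbour search, which stays competitive on small data samples.
knn = KNeighborsClassifier(n_neighbors=5, weights="uniform", algorithm="brute")
knn.fit(Xtr, ytr)
print("k-NN (brute force) accuracy:", knn.score(Xte, yte))
```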
“…This consideration is important for our application, since the COVID-19 image datasets are quite small (i.e., without augmentation). Furthermore, our choice is reinforced by the fact that brute-force k-NN exhibits high performance, as presented in several image-processing applications [34, 42, 43].…”
Section: Experimental Validation (mentioning)
confidence: 99%
“…Feature selection was applied to the cell apoptosis/survival dataset to achieve a good result; the approaches fall into three main categories, namely the wrapper method (WM), the filter method (FM), and the embedded method (EM). After applying the feature selection (FS) algorithm, seven different marker proteins were obtained [42].…”
Section: Feature Selection-based Classification Algorithms (mentioning)
confidence: 99%
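For orientation, the sketch below instantiates the three feature-selection families named in this excerpt (filter, wrapper, embedded), assuming scikit-learn. The synthetic data and the target of seven selected features are illustrative only and do not reproduce the cited analysis.

```python
# Hedged sketch of filter, wrapper, and embedded feature selection,
# assuming scikit-learn; data and the choice of seven features are illustrative.
from sklearn.datasets import make_classification
from sklearn.feature_selection import RFE, SelectFromModel, SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.svm import LinearSVC

X, y = make_classification(n_samples=200, n_features=30, n_informative=7,
                           random_state=0)

# Filter method (FM): rank features by a univariate statistic, independent of any model.
filter_sel = SelectKBest(score_func=f_classif, k=7).fit(X, y)

# Wrapper method (WM): recursively eliminate features based on a model's coefficients.
wrapper_sel = RFE(LogisticRegression(max_iter=1000), n_features_to_select=7).fit(X, y)

# Embedded method (EM): selection happens inside model training, here via L1 sparsity
# (the number of retained features follows the sparsity, not a fixed count).
embedded_sel = SelectFromModel(
    LinearSVC(C=0.05, penalty="l1", dual=False, max_iter=5000)
).fit(X, y)

for name, sel in [("filter", filter_sel), ("wrapper", wrapper_sel),
                  ("embedded", embedded_sel)]:
    print(name, "selected feature indices:", list(sel.get_support(indices=True)))
```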