2013
DOI: 10.1007/978-3-642-38679-4_42
Texture Classification Using Kernel-Based Techniques

Cited by 5 publications (6 citation statements)
References 23 publications
“…These images are from the dataset owned by G.-Z. Yang [44] (Imperial College of Science, Technology and Medicine, London) and have been used in several publications [2, 45, 46].…”
Section: Methods (mentioning confidence: 99%)
“…GAs for feature selection were first proposed by Siedlecki and Sklansky [52]. Many studies on GA for feature selection have followed [6, 53], concluding that GA is well suited to finding near-optimal solutions for large problems with more than 40 candidate features. GA-based feature selection can also be combined with a classifier such as SVM, optimizing it.…”
Section: Methods (mentioning confidence: 99%)
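The GA-for-feature-selection scheme described above can be sketched in a few lines. This is an illustrative toy, not the cited papers' implementation: chromosomes are feature bitmasks, fitness is leave-one-out accuracy of a simple nearest-centroid classifier (standing in for the SVM wrapper), and the data, parameters, and function names are all invented for the example.

```python
# Hypothetical GA feature-selection sketch. Fitness = leave-one-out accuracy
# of a nearest-centroid classifier restricted to the selected features.
import random

random.seed(0)

def make_data(n=40, d=8):
    """Synthetic 2-class data: features 0-1 informative, the rest noise."""
    X, y = [], []
    for i in range(n):
        label = i % 2
        row = [random.gauss(3.0 * label, 0.5) if j < 2 else random.gauss(0, 3)
               for j in range(d)]
        X.append(row)
        y.append(label)
    return X, y

def accuracy(mask, X, y):
    """Leave-one-out nearest-centroid accuracy on the masked features."""
    feats = [j for j, bit in enumerate(mask) if bit]
    if not feats:
        return 0.0
    correct = 0
    for i in range(len(X)):
        cents = {}
        for c in set(y):
            rows = [X[k] for k in range(len(X)) if k != i and y[k] == c]
            cents[c] = [sum(r[j] for r in rows) / len(rows) for j in feats]
        dists = {c: sum((X[i][j] - cent[t]) ** 2
                        for t, j in enumerate(feats))
                 for c, cent in cents.items()}
        if min(dists, key=dists.get) == y[i]:
            correct += 1
    return correct / len(X)

def ga_select(X, y, d=8, pop_size=20, gens=15):
    """Elitist GA: one-point crossover plus single-bit mutation."""
    pop = [[random.randint(0, 1) for _ in range(d)] for _ in range(pop_size)]
    for _ in range(gens):
        scored = sorted(pop, key=lambda m: accuracy(m, X, y), reverse=True)
        elite = scored[: pop_size // 2]          # keep the best half
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = random.sample(elite, 2)
            cut = random.randrange(1, d)         # one-point crossover
            child = a[:cut] + b[cut:]
            child[random.randrange(d)] ^= 1      # flip one bit (mutation)
            children.append(child)
        pop = elite + children
    return max(pop, key=lambda m: accuracy(m, X, y))

X, y = make_data()
best = ga_select(X, y)
print("selected mask:", best, "accuracy:", accuracy(best, X, y))
```

In a real wrapper setup the nearest-centroid fitness would be replaced by cross-validated SVM accuracy, which is where the GA+SVM combination mentioned above comes in.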
“…Several studies have opted for combining evolutionary methods with classification mechanisms [6, 13, 15–17]. However, even when very efficient models have been developed, ANNs have the disadvantage that their training is strongly stochastic, which makes the process non-repeatable.…”
Section: State of the Art (mentioning confidence: 99%)
“…The hyperplane separates the "positive" from the "negative" examples, so that the distance between the boundary and the nearest data point in each class is maximal; these nearest data points define the margins and are called support vectors (Burges, 1998). SVMs have proved to be exceptionally efficient in higher-dimensional classification problems (Moulin et al, 2004; Chapelle et al, 1999; Fernandez-Lozano et al, 2013a), because of their ability to generalize in high-dimensional spaces. SVM uses different non-linear kernel functions, such as the polynomial, sigmoid, and radial basis function kernels; the non-linear SVM maps the training samples from the input space into a higher-dimensional feature space via a mapping function (Burges, 1998).…”
Section: SVM (mentioning confidence: 99%)
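The kernel trick described in that passage can be illustrated without a full SVM solver. The sketch below uses a kernel perceptron, a simpler relative of the SVM that shares the same kernelized decision function but lacks margin maximization; the radial basis function kernel lets it separate XOR-style data that no linear boundary can. All names and parameters here are illustrative, not from the paper.

```python
# Kernel perceptron with an RBF kernel on XOR data (illustrative sketch;
# a simpler stand-in for a kernel SVM, without margin maximization).
import math

# XOR-style data: not linearly separable in the input space.
X = [(0, 0), (0, 1), (1, 0), (1, 1)]
y = [-1, 1, 1, -1]

def rbf(a, b, gamma=2.0):
    """Radial basis function kernel: exp(-gamma * ||a - b||^2)."""
    return math.exp(-gamma * sum((u - v) ** 2 for u, v in zip(a, b)))

def train(X, y, epochs=20):
    """Perceptron updates in the kernel-induced feature space."""
    alpha = [0.0] * len(X)
    for _ in range(epochs):
        for i, (xi, yi) in enumerate(zip(X, y)):
            s = sum(alpha[j] * y[j] * rbf(X[j], xi) for j in range(len(X)))
            if yi * s <= 0:          # misclassified -> increase its weight
                alpha[i] += 1.0
    return alpha

def predict(alpha, x):
    """Decision function: sign of the kernel-weighted sum over examples."""
    s = sum(alpha[j] * y[j] * rbf(X[j], x) for j in range(len(X)))
    return 1 if s > 0 else -1

alpha = train(X, y)
preds = [predict(alpha, x) for x in X]
print("predictions:", preds)
```

Training examples with non-zero `alpha` play a role analogous to the support vectors of the passage: only they contribute to the decision function. Swapping `rbf` for a linear kernel makes the same loop fail on this data, which is exactly the point of the non-linear mapping.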