2014
DOI: 10.1007/978-3-319-14249-4_18
Image Classification via Semi-supervised Feature Extraction with Out-of-Sample Extension

Cited by 2 publications (4 citation statements)
References 16 publications
“…Although the training dataset consists only of 720 images, HMS performs better than the semi-supervised feature extraction algorithm in Ref. 16. Thus, for a hierarchical partitioning of the training data with k = 2 and N_1 = 1, HMS reaches an average recognition rate (over ten random splits of the data) of 94.98% with 15 sensing values.…”
Section: Performance of HMS versus Other Methods
Confidence: 93%
“…In 2014, Dornaika et al. 16 developed a semi-supervised feature extraction algorithm with an out-of-sample extension, which they applied to a subset of the COIL-20 database (18 of the 72 available images for each object). They randomly selected 50% of the data as the training dataset and used the rest as the test dataset.…”
Section: Performance of HMS versus Other Methods
Confidence: 99%