Advances in Visual Computing
DOI: 10.1007/978-3-540-76858-6_8
Robust Classification of Strokes with SVM and Grouping

Cited by 2 publications (2 citation statements) · References 6 publications
“…Both techniques rely on fixed heuristics or parser rules and capture only a specific sketching domain. Nataneli & Faloutsos [12] and Zhou et al. [22] present SVM-based workflows for stroke classification and grouping, and for letter segmentation in handwritten Japanese text, respectively. Except for the spatial structural approach of Nataneli & Faloutsos [12], all of the aforementioned works employ the creation history of a sketched scene for data organization and analysis.…”
Section: Related Work
confidence: 99%
“…Nataneli & Faloutsos [12] and Zhou et al. [22] present SVM-based workflows for stroke classification and grouping, and for letter segmentation in handwritten Japanese text, respectively. Except for the spatial structural approach of Nataneli & Faloutsos [12], all of the aforementioned works employ the creation history of a sketched scene for data organization and analysis. Sezgin & Davis provide empirical evidence for per-user predictability of stroke orderings for recurring known objects [15].…”
Section: Related Work
confidence: 99%
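
The SVM-based stroke classification referred to in these citation statements can be illustrated with a minimal, hypothetical sketch: a few hand-crafted geometric features are computed per stroke and fed to an RBF-kernel SVM. The feature set and the toy line-versus-loop data below are illustrative assumptions, not the actual pipeline of Nataneli & Faloutsos [12] or Zhou et al. [22].

    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    def stroke_features(points):
        """Simple geometric features for a stroke given as an (N, 2) point array:
        total path length, bounding-box aspect ratio, and straightness
        (endpoint distance divided by path length)."""
        points = np.asarray(points, dtype=float)
        segments = np.diff(points, axis=0)
        path_len = np.linalg.norm(segments, axis=1).sum() + 1e-9
        w, h = points.max(axis=0) - points.min(axis=0) + 1e-9
        straightness = np.linalg.norm(points[-1] - points[0]) / path_len
        return [path_len, w / h, straightness]

    def synthetic_stroke(kind, rng):
        """Generate a toy stroke: a noisy straight line (class 0) or a noisy loop (class 1)."""
        t = np.linspace(0.0, 1.0, 40)
        if kind == 0:
            pts = np.column_stack([t, 0.3 * t])
        else:
            pts = np.column_stack([np.cos(2 * np.pi * t), np.sin(2 * np.pi * t)])
        return pts + rng.normal(scale=0.02, size=pts.shape)

    rng = np.random.default_rng(0)
    labels = rng.integers(0, 2, size=200)
    X = np.array([stroke_features(synthetic_stroke(k, rng)) for k in labels])

    # RBF-kernel SVM on standardized features, the usual setup for such classifiers.
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
    clf.fit(X, labels)

    # Classify an unseen stroke.
    print(clf.predict([stroke_features(synthetic_stroke(1, rng))]))

In a real sketching interface the same idea would be applied to recorded pen trajectories, with richer features and per-domain class labels; the grouping step described in the cited works is not covered by this sketch.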