2017
DOI: 10.1016/j.patcog.2016.07.018
Recognizing arbitrarily connected and superimposed handwritten numerals in intangible writing interfaces

Cited by 13 publications (2 citation statements)
References 36 publications
“…The work states that accuracies of 99.76% and 95.22% have been obtained for printed and handwritten numerals, respectively. Segmenting and recognizing arbitrarily connected and superimposed handwritten numerals in one-stroke finger gestures has been a long-standing problem, and a solution has been proposed by Chiang et al. [3]. The method has two phases: a key numeral spotting (KNS) phase and a recognition by concatenation (RBC) phase.…”
Section: Literature Survey
confidence: 99%
“…Recognition methodologies use the captured hand or arm position trajectories as features, with Neural Networks (NNs) or Hidden Markov Models (HMMs) (see Section 4) as classifiers, to extract human-readable text. Another approach uses video-based hand tracking to recognise characters in one-stroke finger gestures [99]. Chang et al. [100] proposed a framework for mid-air handwriting recognition intended for use with wearable egocentric cameras.…”
Section: Applications and Contexts
confidence: 99%