2010 4th International Conference on Signal Processing and Communication Systems
DOI: 10.1109/icspcs.2010.5709716

Vowel recognition from continuous articulatory movements for speaker-dependent applications

Abstract: A novel approach was developed to recognize vowels from continuous tongue and lip movements. Vowels were classified based on movement patterns (rather than on derived articulatory features, e.g., lip opening) using a machine learning approach. Recognition accuracy on a single-speaker dataset was 94.02% with a very short latency. Recognition accuracy was better for high vowels than for low vowels. This finding parallels previous empirical findings on tongue movements during vowels. The recognition algo…
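The truncated abstract does not name the classifier or feature layout. As a purely illustrative sketch (the model choice, the marker set, the window length, and the synthetic data below are all assumptions, not the authors' method), the following shows how vowels might be classified from windowed tongue and lip movement trajectories with a standard machine learning pipeline:

```python
# Illustrative sketch only -- the paper's actual classifier and features are
# not specified in the truncated abstract; everything below is an assumption.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Hypothetical data: 400 tokens of 8 vowels, each a 20-frame window of
# 3-D positions for 4 articulatory markers (e.g., tongue tip, tongue body,
# upper lip, lower lip), flattened to one feature vector per token.
n_tokens, n_frames, n_markers = 400, 20, 4
X = rng.normal(size=(n_tokens, n_frames * n_markers * 3))
y = rng.integers(0, 8, size=n_tokens)  # vowel class labels

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

# Standardize each kinematic feature, then fit an RBF-kernel SVM.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X_train, y_train)
print(f"accuracy: {clf.score(X_test, y_test):.2%}")
```

With random stand-in data the printed accuracy is near chance; the 94.02% figure in the abstract comes from real single-speaker articulatory recordings.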

Cited by 6 publications (3 citation statements)
References 26 publications

“…The data presented in the classification matrices (Tables 2 and 3) and the distance matrix (Table 4) for vowels indicated that / /, / /, / /, and / / were easier to distinguish than were / /, / /, / /, and / /. This result supports the previous findings that low tongue vowels (e.g., / /) have more articulatory variation than high tongue vowels (e.g., / / and / /, see Perkell & Cohen, 1989; Wang, Green, Samal, & Carrell, 2010). More specifically, our results suggest high and front vowels (i.e., / /, / /, / /, and / /) are more articulatorily distinct than low and back vowels (i.e., / /, / /, / /, and / /).…”
Section: Discussion (supporting)
confidence: 86%
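For readers unfamiliar with the two table types this statement refers to, here is a minimal sketch (the library calls, the synthetic data, and the class-mean distance definition are assumptions, not taken from the cited paper) of how a classification (confusion) matrix and a between-vowel distance matrix can be derived from predictions and articulatory feature vectors:

```python
# Illustrative sketch: a confusion matrix over vowel predictions and a
# Euclidean distance matrix between per-vowel mean feature vectors.
import numpy as np
from sklearn.metrics import confusion_matrix
from scipy.spatial.distance import pdist, squareform

rng = np.random.default_rng(1)
n_tokens, n_features, n_vowels = 200, 24, 4
X = rng.normal(size=(n_tokens, n_features))        # articulatory features
y_true = rng.integers(0, n_vowels, size=n_tokens)  # true vowel labels
y_pred = rng.integers(0, n_vowels, size=n_tokens)  # stand-in predictions

# Rows: true vowel; columns: predicted vowel (cf. Tables 2 and 3).
cm = confusion_matrix(y_true, y_pred)

# Pairwise distances between class-mean articulatory vectors (cf. Table 4):
# larger distances suggest vowels that are easier to tell apart.
means = np.stack([X[y_true == v].mean(axis=0) for v in range(n_vowels)])
dist = squareform(pdist(means, metric="euclidean"))
print(cm, dist, sep="\n\n")
```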
“…For example, additional research is required to determine if classification accuracy is a sensitive metric for quantifying the severity of speech impairment or the articulatory changes that occur under different speaking conditions (Mefferd & Green, 2010). In addition, further work is planned to determine if the classification approaches are suitable as the recognition engine for silent speech interfaces (Denby et al., 2010; Fagan et al., 2008; Hueber et al., 2010; Wang, Green, Samal, & Carrell, 2010; Wang, Samal, Green, & Rudzicz, 2012a, 2012b) to facilitate oral communication in patients with moderate to severe speech or voice impairments. Finally, although only female talkers were investigated, we anticipate that the classification of male talkers' vowels and consonants would produce similar results.…”
Section: Discussion (mentioning)
confidence: 99%
“…Practical implications of these data have to do with the development of technologies for speech recognition and visual augmented articulatory feedback. The tongue trajectories containing location information are beginning to be successfully used in speech recognition algorithms based on kinematic signals, which aim to enhance existing acoustic-based recognition models (Wang et al., 2010). Additionally, positional targets similar to the ones shown here have been used to treat place of articulation in motor speech disorders (see Katz, Bharadwaj, & Carstens, 1999; McNeil et al., 2010).…”
Section: E. Implications (mentioning)
confidence: 91%