2019
DOI: 10.1016/j.autcon.2019.102847

Gesture and speech elicitation for 3D CAD modeling in conceptual design

Cited by 26 publications (23 citation statements)
References 28 publications
“…This study extends previous works done on multimodal gesture and speech elicitation [27,37]. This extension is seen in the results reported and the methodology used.…”
Section: Gesture and Speech Studies (supporting)
Confidence: 85%
“…This is different from our choice to examine each input individually. In both studies the referents were shown as animations, however, in this study participants were told that they were interacting with a system whereas Khan et al asked participants to describe the referents to another person via a video chat [27]. The use case of computer-aided design as well as the choice of observing interactions compared to referent descriptions is markedly different, with examples of the referents used there being extrude surface or pan.…”
Section: Gesture and Speech Studies (mentioning)
Confidence: 98%
“…The experiment was conducted to elicit speech and gesture preferences for conceptual CAD modeling from the stated professional groups. In related studies, we presented user preferences of gestures (Khan and Tuncer, n.d., 2017; Tunçer and Khan, 2018) and the implementation of a prototype (Khan et al., 2017).…”
Section: Introduction (mentioning)
Confidence: 99%