2014
DOI: 10.1016/j.cad.2013.08.039
GaFinC: Gaze and Finger Control interface for 3D model manipulation in CAD application

Cited by 37 publications (35 citation statements)
References 7 publications
“…CAD/CAE software needs to be engaging to the end-user so that it can consolidate immersion. In the single workstation domain, natural and intuitive interfaces for CAD modelling, such as hand gesture controls, have received a lot of attention recently (Song, Cho, Baek, Lee, & Bang, 2014). Such systems, also referred to as virtual reality (VR)-based CAD systems, allow the user to 'grab' an object and manipulate it as if physically holding it in his/her hand, providing users with a more natural and intuitive method of interaction.…”
Section: Remote Visualisation of 3D Models
“…To train the system, an HMM is used, resulting in 94.8% accuracy. Song et al [12] have proposed a multi-modal interface to control 3D computer-aided design (CAD) models using finger movement and eye gaze motion. There are many other methods and techniques that use a hand data glove for more efficient and accurate interaction and interfaces [16][17][18][19][20], but due to the space limitations of this paper they are not described in detail here.…”
Section: Reality
confidence: 99%
“…Dragon Dictation) supplied with sentence syntax analysis and adaptive machine learning. e) Packages supporting the Microsoft Kinect and Creative Gesture Camera to control HRT, described in our former papers [14][15], are still compatible with HRT and can be used for some special purposes. However, the ultra-fast progress in NUI sensors makes them uncompetitive with the latest devices.…”
Section: NUI Hardware
confidence: 99%
“…Such a multi-modal approach (exploited, e.g., in the CAD system of [14]) can significantly reduce the dwell time between the action of the operator and the reaction of the system.…”
Section: Introduction
confidence: 99%