2018
DOI: 10.1007/978-3-030-01054-6_22
Designing an Augmented Reality Multimodal Interface for 6DOF Manipulation Techniques

Cited by 15 publications (9 citation statements)
References 8 publications
“…Ismail et al [ 9 ] merged voice and gestures in the context of virtual-reality interaction, making the user's operation of dealing with virtual items in an AR environment more natural. Kadavasal and Oliver [ 10 ] created a virtual-reality driving system for autistic individuals that included physiological signals, brain signals, and eye gaze information, to improve autistic patients' driving abilities.…”
Section: Related Work
confidence: 99%
“…For instance, Ismail, Billinghurst [25] developed a multimodal AR application, and they used both motion and speech recognition for interaction. Chen, Li [26] also developed a mobile AR application for dog training.…”
Section: Related Work
confidence: 99%
“…One of the important aspects is creating appropriate interaction techniques for AR applications that allow end users to interact with virtual content in an intuitive way [12]. In AR, there are several categories of interaction technique and basically, interaction can be divided into three major parts: 1) tangible user interaction, 2) multimodal input [13] and 3) mobile interaction [14]. Since interaction is the wide topic, this paper focuses on handheld mobile interaction and entail 3D interaction in AR due to the rapid advent of the application on cell phones, and now smartphones and tablets, bringing AR almost to the mainstream [15].…”
Section: Handheld Mobile AR Interface
confidence: 99%
“…In [63], callout display method is used to display the occluded area near the finger touch but the entire finger below the touch still occluded while Paudisch and Chu [65] suggested new hardware design to perform touch input behind the touchscreen display that not yet been launched in the consumer market. In [99], self-occlusion had been spatially solved by estimating the occluded hand and finger motions and gestures, but it is expensive due to the huge calculation requirement and also the limitation of the numbers of dataset according to FIGURE 13. Some examples of occlusion cases happened in the touch-based and mid-air gestures-based interaction techniques; a virtual object occluded by the user's fingers [60], b indirect touch to avoid occlusion [66], c i-virtual object on screen ii-virtual object occluded by finger [61], d Self-occlusions [91], [94], [96], [97], e Major features of AR marker had been occluded [95].…”
Section: Remaining Issues in 3D Object Manipulation within Handheld Mobile AR
confidence: 99%