2011 IEEE Symposium on 3D User Interfaces (3DUI)
DOI: 10.1109/3dui.2011.5759222
Pointing at 3D targets in a stereo head-tracked virtual environment

Cited by 109 publications (93 citation statements)
References 15 publications
“…Moreover, it has been shown that such tactile (or pen-based) interaction suffers from the parallax between the two images shown to the two eyes [13,14,24,82,83]. In addition, touch-through [19,78,81] and invisible-wall problems make such an interaction setup problematic. Only when the element to be accessed by the surface-based input is at a close distance to the input screen do users perceive their input to directly control the manipulated elements [83].…”
Section: 3D Stereoscopic Viewing of 3D Data Visualizations
confidence: 99%
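The parallax problem this excerpt describes can be made concrete: a stereoscopically rendered point projects to two different screen positions, one per eye, so a single touch on the surface cannot coincide with both. A minimal sketch of the on-screen separation, assuming a viewer facing the screen head-on with all distances in metres (the function name and parameters are illustrative, not from the paper):

```python
def screen_disparity(eye_sep: float, screen_dist: float, obj_dist: float) -> float:
    """On-screen separation between the left- and right-eye projections of a
    point straight ahead of the viewer.

    eye_sep     -- interocular distance (e.g. ~0.065 m)
    screen_dist -- distance from the eyes to the display surface
    obj_dist    -- distance from the eyes to the rendered point

    Positive result: uncrossed disparity (point appears behind the screen);
    negative: crossed disparity (point appears in front of the screen).
    By similar triangles, the two projection rays separate on the screen
    plane in proportion to how far the point lies from that plane.
    """
    return eye_sep * (obj_dist - screen_dist) / obj_dist


# A point on the screen plane has zero disparity -- only there does a touch
# land exactly where the point is seen, matching the excerpt's observation.
print(screen_disparity(0.065, 0.6, 0.6))   # point on the screen plane
print(screen_disparity(0.065, 0.6, 1.2))   # point behind the screen plane
```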
“…In the monoscopic case the mapping between an on-surface touch point and the intended object point in the virtual scene is straightforward, but with stereoscopic projection this mapping introduces problems [48]. To enable direct 3D "touch" selection of stereoscopically displayed objects in space, 3D tracking technologies can capture a user's hand or finger motions in front of the display surface.…”
Section: Touch Screen Interaction and S3D
confidence: 99%
“…Recent studies by Teather et al. [12,13] deal with pointing-task evaluation in a 3D environment based on the ISO standard. In the first study [12], targets are spheres arranged in a 2D circle on a vertical plane.…”
Section: MT
confidence: 99%
“…In the first study [12], targets are spheres arranged in a 2D circle on a vertical plane. In the second study [13], the representation used is a 3D scene in which targets are circles placed on cylinders. In both cases, due to the perspective rendering mode, the target representation appears on a plane, either horizontal or vertical: selecting a target thus results from the combination of 2D pointing on the plane with a ray-casting technique to reach the appropriate depth.…”
Section: MT
confidence: 99%
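The selection scheme described in this excerpt, a 2D point on the display resolved in depth by ray casting, can be sketched as follows. This is a hedged illustration under an assumed pinhole camera looking down −z, not the setup used in the cited studies; all function names and parameters are hypothetical:

```python
import numpy as np


def screen_to_ray(px, py, width, height, fov_y, cam_pos):
    """Unproject a 2D pixel coordinate into a world-space ray.

    Assumes a pinhole camera at cam_pos looking down the -z axis with a
    vertical field of view fov_y (radians). Returns (origin, unit direction).
    """
    aspect = width / height
    half_tan = np.tan(fov_y / 2.0)
    # Map pixels to normalized device coordinates in [-1, 1], y flipped.
    x = (2.0 * px / width - 1.0) * aspect * half_tan
    y = (1.0 - 2.0 * py / height) * half_tan
    direction = np.array([x, y, -1.0])
    return np.asarray(cam_pos, dtype=float), direction / np.linalg.norm(direction)


def pick_sphere(origin, direction, spheres):
    """Return the index of the nearest sphere hit by the ray, or None.

    spheres is a list of (center, radius) pairs; this is the standard
    quadratic ray/sphere intersection, keeping the smallest positive t.
    """
    best, best_t = None, np.inf
    for i, (center, radius) in enumerate(spheres):
        oc = origin - center
        b = np.dot(oc, direction)
        c = np.dot(oc, oc) - radius * radius
        disc = b * b - c
        if disc < 0.0:
            continue  # ray misses this sphere
        t = -b - np.sqrt(disc)  # nearer of the two intersections
        if 0.0 < t < best_t:
            best, best_t = i, t
    return best
```

A click at the centre of an 800×600 viewport unprojects to the ray (0, 0, −1) and selects a sphere centred at (0, 0, −5), while a click in the top-left corner misses it, which is the 2D-pointing-plus-ray-cast combination the excerpt attributes to both studies.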