2012 IEEE Symposium on 3D User Interfaces (3DUI)
DOI: 10.1109/3dui.2012.6184189
Virtual exertions: A user interface combining visual information, kinesthetics and biofeedback for virtual object manipulation

Abstract: Virtual Reality environments have the ability to present users with rich visual representations of simulated environments. However, means to interact with these types of illusions are generally unnatural in the sense that they do not match the methods humans use to grasp and move objects in the physical world. We demonstrate a system that enables users to interact with virtual objects with natural body movements by combining visual information, kinesthetics and biofeedback from electromyograms (EMG). Our metho…
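The sketch below illustrates, in simplified form, the kind of grasp rule the abstract describes: an object is "held" only while the hand overlaps it and the measured muscle activity stays above a calibrated threshold. This is a minimal illustration and not the authors' implementation; the class names, the RMS-style EMG value, and the threshold number are all assumptions for demonstration.

```python
# Minimal sketch (not the authors' implementation) of the grasp rule the
# abstract describes: a virtual object is "held" only while the hand overlaps
# it AND the user's muscle activity (EMG) stays above a calibrated threshold.

from dataclasses import dataclass


@dataclass
class VirtualObject:
    mass_kg: float          # simulated mass presented to the user
    held: bool = False


class GraspController:
    def __init__(self, emg_threshold: float):
        # emg_threshold is assumed to come from a per-user calibration step
        self.emg_threshold = emg_threshold

    def update(self, obj: VirtualObject, hand_overlaps_object: bool, emg_rms: float) -> None:
        """Update the held state from collision info and the current EMG level."""
        if hand_overlaps_object and emg_rms >= self.emg_threshold:
            obj.held = True    # user is "exerting" enough to keep holding it
        else:
            obj.held = False   # relaxing or moving away releases the object


if __name__ == "__main__":
    ctrl = GraspController(emg_threshold=0.25)   # illustrative, unitless RMS value
    box = VirtualObject(mass_kg=2.0)
    ctrl.update(box, hand_overlaps_object=True, emg_rms=0.4)
    print(box.held)   # True: hand on the object and sufficient muscle activity
```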

Cited by 8 publications (8 citation statements), published between 2015 and 2023. References 13 publications.

“…Visual cues regarding the weight of an object and the remembered internal models that it stimulates also influence muscle activity [16]. Within a virtual reality environment, realistic grasping movements without physical interaction could only be achieved if thresholds were based on actual physical interactions, which were higher and required calibration for each person [17]. While the literature highlights many important features related to grasping, to our knowledge, anticipatory signals prior to object release have not been shown.…”
Section: Introduction (mentioning)
confidence: 97%
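The per-user calibration mentioned in the statement above ([17]) could, in spirit, look like the following sketch: record EMG activity while the person lifts a real reference object a few times, then derive that person's grasp threshold from those recordings. The use of a mean and the scaling margin are illustrative assumptions, not the procedure from the cited paper.

```python
# Hedged sketch of a per-user EMG threshold calibration based on real lifts.
# The margin factor and the averaging are assumptions for illustration only.

from statistics import mean


def calibrate_threshold(physical_lift_emg_rms: list[float], margin: float = 0.8) -> float:
    """Return a per-user EMG threshold from RMS values measured during real lifts."""
    if not physical_lift_emg_rms:
        raise ValueError("need at least one calibration lift")
    return margin * mean(physical_lift_emg_rms)


# Example: three calibration lifts of the same physical reference object
threshold = calibrate_threshold([0.31, 0.28, 0.35])
print(f"user-specific grasp threshold: {threshold:.3f}")
```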
“…In many applications this form of interaction is preferable due to the discomfort of wearable devices (Suzuki et al. [1]) and their often time-consuming configuration and user adaptation (Holz et al. [2]). Moreover, recent studies, notably Ponto et al. [3], have illustrated that wearable methods of user feedback, such as biofeedback from electromyograms (EMG), can aid human grasping but often cause fatigue and discomfort. Early developments of freehand interaction largely ignored the application of complex human grasps.…”
Section: Introduction (mentioning)
confidence: 99%
“…According to Ponto et al. [151] and Chen et al. [86], the exertion force scales linearly with the object's mass, and a minimum exertion force can be associated with the effort required to grasp and lift an object. The research combined biofeedback from electromyograms (EMG) with visual effects to simulate weight perception when manipulating virtual objects in a CAVE environment.…”
Section: Passive Force (mentioning)
confidence: 99%
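The relationship summarised in the statement above can be written as a small worked example: a minimum grasp effort plus a term that grows linearly with mass. The coefficients below are made up for demonstration; the cited works report their own fitted values.

```python
# Illustrative only: required exertion = minimum grasp effort + linear term in mass.
# The slope and minimum values are arbitrary placeholders, not reported results.

def required_exertion(mass_kg: float, slope: float = 0.12, min_grasp_effort: float = 0.2) -> float:
    """Normalized exertion level: a minimum grasp effort plus a term scaling linearly with mass."""
    return min_grasp_effort + slope * mass_kg


for m in (0.5, 1.0, 2.0, 4.0):
    print(f"mass {m} kg -> exertion {required_exertion(m):.3f}")
```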
“…Latency in VR can be caused by delays originating from the sensors, from processing, from data transmission, from data smoothing, from rendering, and from frame-rate drops [165]. This issue was seen in [151] and [87], where latency caused a distortion effect (e.g., objects appearing to be stuck in the hand), which ultimately may have decreased the sense of immersion and realism.…”
Section: F. Synchronicity (mentioning)
confidence: 99%
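As a simple illustration of why the stages listed in the statement above matter, end-to-end ("motion-to-photon") latency can be thought of as the sum of those per-stage delays. The numbers below are arbitrary and only show how several small delays can add up to a noticeable lag of the kind described (an object appearing to lag behind or stick to the hand).

```python
# Arbitrary example values (ms) for the latency sources listed above; the totals
# are illustrative, not measurements from the cited studies.

latency_ms = {
    "sensor": 5.0,
    "processing": 8.0,
    "transmission": 3.0,
    "smoothing": 10.0,
    "rendering": 11.0,
    "frame_drops": 16.7,   # roughly one missed frame at 60 Hz
}

total = sum(latency_ms.values())
print(f"estimated motion-to-photon latency: {total:.1f} ms")
```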