2009
DOI: 10.1109/vr.2009.4811003
Eye Tracking for Avatar Eye Gaze Control During Object-Focused Multiparty Interaction in Immersive Collaborative Virtual Environments

Cited by 39 publications (19 citation statements)
References 28 publications
“…Therefore, the model automatically adjusts to the control method. Hence, if gaze is driven by an eye tracker in real time, as presented in Steptoe et al. 36, the model will respond according to the wearer's gaze behaviour, which is likely to reflect their current emotional or mood state: increased and rapid eye saccades will be matched with lively lid saccades, thereby communicating vital nonverbal information to an observer or interactional partner in avatar-mediated communication. Similarly, negative or submissive traits such as avoiding eye contact and frequently looking downwards are likely to be exaggerated by the corresponding model-generated lid saccades, which naturally display the closed state of the eyes.…”
Section: Discussion
Confidence: 99%
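The coupling described above, lid motion following vertical gaze and quickening during eye saccades, can be illustrated with a minimal sketch. This is a hypothetical toy model, not the cited authors' implementation: the function name, the linear pitch-to-aperture mapping, and the saccade-speed threshold are all illustrative assumptions.

```python
# Hypothetical sketch: drive eyelid aperture from tracked gaze.
# Looking down lowers the lid; a fast saccade transiently narrows it.

def lid_aperture(gaze_pitch_deg: float, gaze_speed_deg_s: float,
                 saccade_threshold: float = 100.0) -> float:
    """Return eyelid aperture in [0, 1], where 1 is fully open.

    gaze_pitch_deg: vertical gaze angle (negative = looking down).
    gaze_speed_deg_s: angular speed of the eye; above the threshold
    it is treated as a saccade and triggers a brief lid saccade.
    """
    # Baseline: lid roughly follows vertical gaze over +/-30 degrees.
    base = 0.8 + 0.2 * max(-1.0, min(1.0, gaze_pitch_deg / 30.0))
    # Lid saccade: a fast eye movement momentarily narrows the aperture.
    if gaze_speed_deg_s > saccade_threshold:
        base *= 0.6
    return max(0.0, min(1.0, base))
```

Fed with real-time eye-tracker samples, such a mapping would reproduce the behaviours the excerpt mentions: downward gaze yields a partly closed lid, and rapid saccades produce lively lid movement.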
“…Here, a collision detection mechanism operating at the physical level is used for control. This method helps the avatar avoid getting stuck on obstacles [15].…”
Section: Autonomous Motion Control
Confidence: 99%
“…Researchers have explored a wide variety of devices and techniques for avatar control, including using the mouse [15], keyboard [21], gamepad [24], data glove [8], head and hand tracker [2], eye tracker [4,22], hand gestures [14], full body actions [20], and brain activities [16,23]. Some of these control methods are intended for immersive virtual environments with head-mounted displays; we limit our scope to desktop CVEs, where users look at the virtual world through a computer monitor.…”
Section: Controlling Avatars in Desktop CVEs
Confidence: 99%