2009
DOI: 10.1007/s10209-009-0145-4

A blueprint for integrated eye-controlled environments

Cited by 8 publications (4 citation statements)
References 13 publications
“…It allows interaction when the user's gaze falls on a smart device. Authors in [12] presented another gaze-based system for home automation. They compared "direct" and "mediated" interaction solutions.…”
Section: Introduction (mentioning)
confidence: 99%
“…Objects are often identified using RFID tags [9,10] or head-mounted sensors [11]. Some authors propose object identification methods based on graphical features, using cameras [12,13] and smartphones [14]. Different local detectors and descriptors were used for object detection in the context of interaction with smart objects.…”
Section: Introduction (mentioning)
confidence: 99%
“…There are some related works that also integrate an object recognition system with an eye-tracking application, such as [Ishiguro et al. 2010; Bonino et al. 2009]. However, the evaluation of the benefits of the integration is not discussed in depth in these previous works.…”
Section: Introduction (mentioning)
confidence: 99%
“…For example, a control application based on gaze and eye tracking is described in [2] and [3]. This paper presents the results of an R&D project that uses Virtual Reality environments to create a 3D interface for residential gateways.…”
Section: Introduction (mentioning)
confidence: 99%