Proceedings of the 8th International Conference on Automotive User Interfaces and Interactive Vehicular Applications 2016
DOI: 10.1145/3003715.3005461
You Do Not Have to Touch to Select

Cited by 24 publications (14 citation statements)
References 21 publications
“…Therefore, one of the most explored approaches was to rely on multimodal interfaces with a haptic component in the IS screen [8]–[13]. Other approaches focused on the design of the IS interface to facilitate in-vehicle interaction (e.g., [14], [15]), which could for example predict the driver's input [16] or adapt the IS interface to the driver's intent [17]. The unique vehicle setting also provided opportunities for feedback, such as feedback in the steering wheel [18]–[20] or seat-belt [20].…”
Section: Related Work
confidence: 99%
“…Other authors explored techniques which implemented indirect interaction with the IS, where the driver can point at the screen and interact with it without touching it. Ahmad et al. [16] developed a predictive system, using a motion controller that detects which item the user will select early in the pointing gesture. The study determined that the time to select a target was reduced by 39% compared to traditional touchscreens.…”
Section: B. Interacting With the IS
confidence: 99%
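The passage above describes early prediction of the intended on-screen item from a partial pointing gesture. The sketch below illustrates one plausible way such a predictor could score items against the extrapolated fingertip direction; the direction-alignment heuristic, thresholds, and function names are illustrative assumptions, not the actual algorithm of Ahmad et al. [16].

```python
# Illustrative sketch only: guesses which on-screen item a partial pointing
# gesture is heading toward by scoring items against the recent fingertip
# motion direction. Not the algorithm of Ahmad et al. [16].
import numpy as np

def predict_target(trajectory, item_positions, min_samples=5, threshold=0.8):
    """trajectory: (N, 3) fingertip samples over time (x, y, z in metres).
    item_positions: (M, 3) centres of selectable items on the display.
    Returns (item_index, score) once one item dominates, else (None, score)."""
    traj = np.asarray(trajectory, dtype=float)
    items = np.asarray(item_positions, dtype=float)
    if len(traj) < min_samples:
        return None, 0.0

    pos = traj[-1]                              # current fingertip position
    motion = traj[-1] - traj[-min_samples]      # recent displacement
    norm = np.linalg.norm(motion)
    if norm < 1e-6:                             # hand has not started moving
        return None, 0.0
    direction = motion / norm

    # Score each item by how well it lies along the motion direction,
    # then normalise the scores into a pseudo-probability per item.
    to_items = items - pos
    dists = np.maximum(np.linalg.norm(to_items, axis=1), 1e-6)
    scores = np.clip(to_items @ direction / dists, 0.0, None)
    total = scores.sum()
    if total == 0.0:
        return None, 0.0
    scores /= total

    best = int(np.argmax(scores))
    # Commit to an early selection only when one item clearly dominates.
    if scores[best] >= threshold:
        return best, float(scores[best])
    return None, float(scores[best])
```

In a real system a score like this would be updated with every new motion-controller frame, so a selection can be offered before the finger reaches the screen.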
“…Although studies on behavioral aspects of gestures for IVIS have been published recently, many of these studies have focused on gestures for touch-sensitive interfaces (e.g., Ahmad et al, 2016; Burnett et al, 2013; Ecker et al, 2010) or gestural interactions near or on the steering wheel (e.g., Angelini et al, 2014; Döring et al, 2011; Fang & Ainsworth, 2012; Lee et al, 2015; Mahr et al, 2011; Werner, 2014). Some works have investigated midair GBI from a user perspective, such as users’ preferences for gesture set design and feedback techniques (e.g., May et al, 2017; März et al, 2016; Riener et al, 2013; Shakeri et al, 2017).…”
Section: Introduction
confidence: 99%
“…The increased accuracy of the system might instead lead to a general reduction of driver demands, since the correction of wrong selections would draw more of the user's attention. In this context, it has been shown that the enhancement of mid-air selection based on additional data results in reduced driver demands (Ahmad, Langdon, Godsill, Donkor, & Wilde, 2016).…”
Section: Discussion
confidence: 99%
“…Future work should focus on the development of a more elaborate fusion algorithm, in order to integrate gaze information based on a probabilistic model. Such approaches have been presented for the incorporation of vehicle data to optimize pointing and touch performance while driving (Ahmad et al., 2016; European Statement of Principles (ESoP), 2008; Mayer, Le, et al., 2018). The results are derived from a relatively specific setup, namely four large elements on a large screen.…”
Section: Limitations
confidence: 99%
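The quoted limitation calls for a probabilistic model that fuses gaze information with pointing data. Below is a minimal Bayesian-fusion sketch under the assumption of independent Gaussian noise on the gaze and pointing estimates; the variances, item layout, and function name are hypothetical and are not the fusion algorithm of the cited works.

```python
# Illustrative sketch: Bayesian fusion of a gaze estimate and a pointing
# estimate into a posterior over selectable items. Gaussian likelihoods and
# the chosen variances are assumptions, not the cited papers' models.
import numpy as np

def fuse_gaze_and_pointing(item_xy, gaze_xy, point_xy,
                           gaze_sigma=40.0, point_sigma=80.0, prior=None):
    """item_xy: (M, 2) item centres in px; gaze_xy, point_xy: (x, y) estimates.
    Returns a posterior probability for each item."""
    items = np.asarray(item_xy, dtype=float)
    if prior is None:
        prior = np.full(len(items), 1.0 / len(items))

    def likelihood(obs, sigma):
        # Gaussian likelihood of each item centre given a noisy 2-D estimate.
        d2 = np.sum((items - np.asarray(obs, dtype=float)) ** 2, axis=1)
        return np.exp(-d2 / (2.0 * sigma ** 2))

    # Assuming conditionally independent sensors:
    # posterior ∝ prior × gaze likelihood × pointing likelihood.
    post = prior * likelihood(gaze_xy, gaze_sigma) * likelihood(point_xy, point_sigma)
    return post / post.sum()

# Example with four large items on a wide screen, as in the quoted setup.
items = [(200, 300), (600, 300), (1000, 300), (1400, 300)]
print(fuse_gaze_and_pointing(items, gaze_xy=(610, 310), point_xy=(700, 290)))
```

The design choice here is simply to multiply the per-sensor likelihoods with a prior over items, which is one standard way to realise the probabilistic integration the quoted passage asks for.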