2018 21st International Conference on Intelligent Transportation Systems (ITSC)
DOI: 10.1109/itsc.2018.8569917

Evaluation of Three In-Vehicle Interactions from Drivers' Driving Performance and Eye Movement behavior

Cited by 6 publications (2 citation statements)
References 24 publications
“…We found that input sensed via nomadic devices was mainly kinesthetic and tactile. Kinesthetic input was performed on handheld devices, such as buttons on a smartphone [224], gaming controller [146], or wearables, e.g., smartwatch [131] and smart ring [118]. Similarly, tactile input was performed on touchscreen handheld devices, e.g., smartphone [260] or tablet [251], and a wearable smartwatch [285].…”
Section: Interaction Locations (mentioning)
confidence: 99%
“…Moreover, 30 publications used auditory, kinesthetic, or tactile modalities for multimodal input. Regarding auditory input, speech was used dominantly in combination with, e.g., button [224], gesture [289], emotion [411], touch [285], or heart rate [204]. Kinesthetic modalities were used diversely, e.g., combined with GSR [239], Skin Temp.…”
Section: Multimodal Interaction (mentioning)
confidence: 99%