IEEE VR 2016 Workshop on Perceptual and Cognitive Issues in AR (PERCAR)
DOI: 10.1109/percar.2016.7562419
Effects of real-world backgrounds on user interface color naming and matching in automotive AR HUDs

Cited by 21 publications (5 citation statements) · References 20 publications
“…HUD reduces focal accommodation time (Merenda, Smith, Gabbard, Burnett, & Large, 2016) and also improves "eyes on the road" time by reducing the number of glances to the instrument cluster (Horrey, Wickens, & Consalus, 2006). Liu (2003) pointed out that the HUD allows more time to scan the traffic scene, quicker reaction times to external road events, less mental stress for drivers, earlier detection of road obstacles, and an easier learning phase.…”
Section: Heads-Up Display (HUDs)
confidence: 99%
“…They found that text legibility is highly affected by the text drawing style, the see-through background, and their interaction. Merenda et al. [23] considered in-car HUD interfaces and conducted a study investigating users' color identification performance under different color-blending conditions caused by the mixture of the colors of the background and the augmented text/symbolic content. They found that participants generally chose brighter colors compared to the original source color of the content, but certain colors, e.g., blue, green, and yellow, were more accurately identified.…”
Section: AR Reduced-Contrast Scenarios
confidence: 99%
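The color-blending effect described above arises because optical see-through displays add light to the scene rather than replacing it. A minimal sketch of that mechanism (an assumption for illustration, not the paper's model; the function name is hypothetical) approximates the perceived color as a clamped per-channel sum of the display color and the background color:

```python
# Sketch (assumption): optical see-through AR adds display light to the
# background, so the perceived color of an AR symbol is approximately the
# per-channel additive mix of display and background, clamped to [0, 255].

def blend_additive(display_rgb, background_rgb):
    """Approximate perceived color as a clamped per-channel sum."""
    return tuple(min(d + b, 255) for d, b in zip(display_rgb, background_rgb))

# A pure-blue AR symbol shown over a warm gray road surface desaturates
# toward a lighter, less saturated blue:
print(blend_additive((0, 0, 255), (120, 100, 90)))  # (120, 100, 255)
```

This simple model already suggests why participants tended to name brighter colors than the source color: the background can only add luminance, never remove it.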
“…Pisanpeeti and Dinet [38] investigated the influence of shape and color features on depth perception from a single still image and showed that the color of a transparent object could alter depth perception by giving an illusion of nearness when shape features were not prominent. A color naming and matching task in AR showed that blue, green, and yellow AR colors had very little hue shift [39]. However, the effect of users' color perception on depth perception should still be explored.…”
Section: Related Work
confidence: 99%
“…The result of the color matching task, in which participants verbally identified colors, shows that blue, green, and yellow were associated with higher accuracies [39]. Therefore, we chose three colors in Experiment 1: blue (0, 0, 255), yellow (255, 255, 0), and green (0, 255, 0), as shown in Figure 3.…”
Section: Experiment 1: Color and Size
confidence: 99%