2013 IEEE Intelligent Vehicles Symposium (IV)
DOI: 10.1109/ivs.2013.6629661
Affective interaction with a companion robot in an interactive driving assistant system

Abstract: Driving assistant systems are becoming attractive service tasks in the field of intelligent robotics. Humans encounter a variety of situations while driving, and robotic systems help them understand how the surrounding situation changes and how the robot system is aware of the given situation. From the viewpoint of interaction performance, proper situation awareness by a robotic assistant in a car and the relevant determination of corresponding reactions are crucial prerequisites for long-term interaction betwe…

Cited by 11 publications (5 citation statements); References 3 publications.
“…Most approaches in our synthesis used screens of various sizes and functions, e.g., [189,193,354,393]. However, we also found different visualization approaches, e.g., head-up displays (HUDs) [331], stereoscopic displays [392], active shutter glasses [49], robotic companion displays [406], projections [257], and shape-changing devices [257]. Regarding HUDs, Beck and Park [31] investigated the perceived importance of different HUD information items, and Häuslschmid et al. [130] evaluated the recognition of stimuli on a windshield HUD.…”
Section: Output Modalities
confidence: 99%
“…A concept by Broy et al. [49] used shutter glasses to visualize a 3D effect on a dashboard display. Yang et al. [406] presented a robotic companion display that automatically pans and tilts. Mok et al. [257] projected the vehicle's automation status on the steering wheel while also changing its shape to visualize a take-over request (TOR).…”
Section: Output Modalities
confidence: 99%
“…Ho [13] used haptic warning signals to interrupt driver distraction or redirect the driver's focus to the direction that needs immediate attention, while Choi et al. [14] collected driver characteristics such as age, gender, and driving ability to adapt the interface to the driver and provide cognitive assistance and continual follow-up through the camera and DAVIS (Drive-Adaptive Vehicle Interaction System), which controls the interaction between the vehicle and the driver by monitoring the driver's condition and feedback. Yang et al. [15] propose an HMI based on a virtual affective agent as an interactive robot system that estimates the driver's status through car operational commands. Park et al. [16] suggest an assistant companion that predicts events from an online stream of sensory measurements by providing voice assistance.…”
Section: Related Work
confidence: 99%
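The citation above mentions estimating driver status from car operational commands. As a rough illustration of that idea only, here is a minimal sketch of mapping aggregated operational signals to a coarse driver status; all names, thresholds, and labels are hypothetical assumptions, not taken from the cited papers.

```python
# Hypothetical sketch: coarse driver-status estimation from operational
# commands. Every signal name, weight, and threshold below is an
# illustrative assumption, not from the cited work.
from dataclasses import dataclass


@dataclass
class CommandWindow:
    """Aggregated car-operation signals over a short time window."""
    steering_reversals: int  # count of steering direction changes
    hard_brakes: int         # count of sudden brake applications
    lane_departures: int     # count of unintended lane departures


def estimate_driver_status(w: CommandWindow) -> str:
    """Map operational-command statistics to a coarse status label."""
    # Weighted sum: lane departures weigh most, then hard braking.
    score = w.steering_reversals + 2 * w.hard_brakes + 3 * w.lane_departures
    if score >= 8:
        return "stressed"
    if score >= 3:
        return "distracted"
    return "attentive"
```

A real system would of course learn such a mapping from data and fuse it with camera-based driver monitoring; the rule-based form here only makes the input/output shape concrete.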
“…The affective intelligent driving agent (AIDA) was introduced to approximate the human's driving goal in place of conventional waypoint-based navigation [2]. We extended human-robot interaction technology in the proposed driving assistant system toward an affective companion over a long-term period [3].…”
Section: Overview of Robotic Driving Assistant
confidence: 99%