2018
DOI: 10.1007/s10514-018-9746-1
Planning for cars that coordinate with people: leveraging effects on human actions for planning and active information gathering over human internal state

Abstract: Traditionally, autonomous cars treat human-driven vehicles as moving obstacles: they predict the human drivers' future trajectories and plan to stay out of their way. While physically safe, this results in defensive and opaque behaviors. In reality, an autonomous car's actions affect what other cars do in response, creating an opportunity for coordination. Our thesis is that we can leverage these responses to plan more efficient and communicative behaviors. We introduce a formulation of interaction wit…
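The abstract's core idea is planning through a model of the human's response rather than around a fixed prediction. In the spirit of that formulation, the sketch below shows a nested optimization: the human is modeled as best-responding to the robot's candidate plan, and the robot optimizes its own reward evaluated on that response. This is a minimal, hypothetical Python sketch; the 1-D longitudinal dynamics, quadratic rewards, horizon, and all function names are illustrative assumptions, not the paper's actual models.

```python
# Minimal sketch of planning "through" a human-response model, assuming
# (hypothetically) 1-D positions and simple smooth rewards. All quantities
# below are illustrative placeholders, not the paper's actual models.
import numpy as np
from scipy.optimize import minimize

HORIZON = 5  # planning horizon in steps (assumed)

def human_response(u_R, x0):
    """Predicted human best response to a candidate robot plan u_R."""
    def neg_human_reward(u_H):
        x_R = x0[0] + np.cumsum(u_R)  # robot's planned positions
        x_H = x0[1] + np.cumsum(u_H)  # human's resulting positions
        # Placeholder: the human penalizes effort and proximity to the robot.
        return np.sum(u_H ** 2) + np.sum(1.0 / (1e-3 + (x_R - x_H) ** 2))
    return minimize(neg_human_reward, np.zeros(HORIZON)).x

def plan_robot(x0):
    """The robot optimizes its own reward evaluated on the human's predicted
    response, so its actions can deliberately influence the human rather
    than merely avoid a fixed predicted trajectory."""
    def neg_robot_reward(u_R):
        u_H = human_response(u_R, x0)
        x_R = x0[0] + np.cumsum(u_R)
        x_H = x0[1] + np.cumsum(u_H)
        # Placeholder: make progress, spend little effort, keep clear.
        return (-x_R[-1] + 0.1 * np.sum(u_R ** 2)
                + np.sum(1.0 / (1e-3 + (x_R - x_H) ** 2)))
    return minimize(neg_robot_reward, np.zeros(HORIZON)).x
```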

Cited by 165 publications (171 citation statements)
References 40 publications
“…Legible motions are motions that express the robot's intent to observers, and can be different from predictable motions [26]. Sadigh et al. modeled a human-robot system jointly as a dynamical system, and proposed methods to plan motions that actively influence human behaviors [28] and further actively gather information about the human's internal state [29], [30]. Similar ideas were explored in human-robot collaboration scenarios.…”
Section: B. Expressive Motion in Robotics (mentioning)
Confidence: 99%
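For the active information gathering the statement above attributes to [29], [30], the robot maintains a belief over the human's internal state (for example, a driving style) and prefers actions whose predicted responses separate the competing hypotheses. A minimal, hypothetical sketch, assuming a discrete set of candidate styles and a Gaussian-shaped observation likelihood; this parameterization is an illustrative assumption, not the paper's:

```python
import numpy as np

def update_belief(belief, u_H_observed, u_H_predicted_per_style, beta=1.0):
    """Bayes update over candidate internal states: styles whose predicted
    action matches the observed human action gain probability mass."""
    likelihoods = np.array([
        np.exp(-beta * np.linalg.norm(u_H_observed - u_H_pred) ** 2)
        for u_H_pred in u_H_predicted_per_style
    ])
    posterior = belief * likelihoods
    return posterior / posterior.sum()

def probing_score(u_H_predicted_per_style):
    """Heuristic score for an exploratory robot action: the more the
    candidate styles' predicted responses differ, the more observing the
    human's actual response will disambiguate between them."""
    preds = np.array(u_H_predicted_per_style)
    return np.var(preds, axis=0).sum()
```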
“…In Eqs. (30) and (31), the reward at each time step is divided by …. We show an example where initially the prediction does not match the measurement. We use the model to re-predict human actions at each time step, and the prediction starts to match the measurement after two seconds.…”
Section: A. Social Navigation Scenario (mentioning)
Confidence: 99%
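The re-prediction described in this statement is a receding-horizon loop: re-plan and re-query the human model at every step, and execute only the first action. A hypothetical sketch reusing the plan_robot and human_response stubs above; get_human_action and step_dynamics are placeholder names standing in for perception and state propagation:

```python
import numpy as np

def get_human_action():
    # Placeholder sensor reading; in reality this comes from perception.
    return 0.0

def step_dynamics(x0, a_R, a_H):
    # Placeholder 1-D dynamics matching the sketch above.
    return np.array([x0[0] + a_R, x0[1] + a_H])

x0 = np.array([0.0, 2.0])  # assumed initial robot/human positions
for t in range(20):
    u_R = plan_robot(x0)                # re-plan from the current state
    u_H_pred = human_response(u_R, x0)  # re-predict the human's actions
    u_H_meas = get_human_action()       # observed human action
    x0 = step_dynamics(x0, u_R[0], u_H_meas)  # execute first action only
```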
“…In order to identify communicative actions, we need a cognitive model of how the human learns: the human might respond to the robot's actions [8], infer the robot's objective [9], or follow a personalized learning strategy [10]. Our research builds upon these prior works [7]–[11] by identifying communicative actions for a trusting human model. Unlike research that explicitly generates deceptive actions [12], here the robot naturally selects misleading communication.…”
Section: Related Work (mentioning)
Confidence: 99%
“…Failure of a HAV to successfully interact with human road users may lead to congestion and human frustration, as a result of overly cautious behaviour on the part of the HAV (Millard-Ball 2016; Brown and Laurier 2017), or may even lead to crashes, if HAVs behave in ways that are unexpected by human road users (Alambeigi, McDonald, and Tankasala 2020). Improved understanding and models, both qualitative and quantitative, of how humans interact in traffic is a key prerequisite for vehicle manufacturers and software developers to program HAVs to successfully interact with humans (Camara et al. 2019; Markkula et al. 2018; Sadigh et al. 2018; Schwarting et al. 2019). This adds urgency to previously existing interaction-related research questions, and also introduces new research questions specific to human-machine interaction, for example whether and how eye contact or other human communicative gestures ought to be replaced with external human-machine interfaces (Clamann 2015; Cefkin et al. 2019).…”
Section: Introduction (mentioning)
Confidence: 99%