2021
DOI: 10.48550/arxiv.2110.11075
Preprint

Enabling a Social Robot to Process Social Cues to Detect when to Help a User

Abstract: It is important for socially assistive robots to recognize when a user needs and wants help. Such robots must recognize human needs in real time so that they can provide timely assistance. We propose an architecture that uses social cues to determine when a robot should provide assistance. Using a multimodal fusion approach over eye-gaze and language modalities, our architecture is trained and evaluated on data collected in a robot-assisted Lego building task. By focusing o…
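The abstract describes fusing eye-gaze and language cues to decide when the robot should offer help. The paper's actual model is not reproduced here, but a minimal late-fusion sketch in PyTorch might look like the following; the `HelpDetector` name, feature dimensions, and layer sizes are illustrative assumptions, not the authors' implementation:

```python
# Hypothetical sketch of late multimodal fusion for help detection,
# assuming gaze features and language embeddings are already extracted.
import torch
import torch.nn as nn

class HelpDetector(nn.Module):
    """Fuses eye-gaze and language features to predict whether the
    user currently needs assistance (binary classification)."""

    def __init__(self, gaze_dim: int = 16, lang_dim: int = 128, hidden: int = 64):
        super().__init__()
        # Per-modality encoders (dimensions are assumptions for illustration).
        self.gaze_enc = nn.Sequential(nn.Linear(gaze_dim, hidden), nn.ReLU())
        self.lang_enc = nn.Sequential(nn.Linear(lang_dim, hidden), nn.ReLU())
        # Late fusion: concatenate per-modality encodings, then classify.
        self.classifier = nn.Sequential(
            nn.Linear(2 * hidden, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, gaze: torch.Tensor, lang: torch.Tensor) -> torch.Tensor:
        fused = torch.cat([self.gaze_enc(gaze), self.lang_enc(lang)], dim=-1)
        return torch.sigmoid(self.classifier(fused)).squeeze(-1)

# Usage: a batch of 4 time windows, each with gaze statistics
# and an utterance embedding for that window.
model = HelpDetector()
p_help = model(torch.randn(4, 16), torch.randn(4, 128))
print(p_help)  # probabilities that the robot should offer assistance
```

Late fusion is only one plausible reading of "multimodal fusion" here; the paper may instead align the modalities earlier in the pipeline.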


Cited by 1 publication (2 citation statements)
References 21 publications
“…We subdivide assistive explanations into task-oriented assistive explanations and general knowledge explanations. Task-oriented explanations are about helping the user in situations where they are stuck due to missing knowledge about the task and environment (e.g., [31]) or due to high task complexity that does not allow them to quickly identify the best action (e.g., [32]). An example of a user question that demands a task-oriented assistive explanation is "What do I have to do next?".…”
Section: Assistive Explanations (mentioning, confidence: 99%)
“…The user does not understand the agent's behavior, task, or environment because some piece of knowledge about them is missing from the user's mental model [9,31]. Filling in those missing pieces via explanations helps with solving the task and with predicting and understanding agent behavior, thus generating trust [18]. For example, the user may not understand the task goals, what preconditions must be fulfilled to execute some action, or where an object is. A related case is a mismatch between the user's and the agent's mental models, in which the user and agent models differ with regard to the environment, task, or each other.…”
Section: Incomplete User Mental Model of the Agent or Task (mentioning, confidence: 99%)