2018
DOI: 10.1109/mra.2018.2815655
Better Teaming Through Visual Cues: How Projecting Imagery in a Workspace Can Improve Human-Robot Collaboration

Cited by 53 publications (25 citation statements)
References 21 publications
“…projectors integrated into the smart environment could visualize the user-defined virtual borders directly on the ground without change of attention. Although there are already solutions in the industrial context [14,24], projectors are not yet widely distributed in current smart homes. Moreover, we currently evaluated our interaction method with a single mobile robot, which is valid for most households.…”
Section: Discussion (mentioning)
confidence: 99%
“…‖x − y‖₂ (7). We utilise this functionality to calculate the collision vector to both HMIs in each moment of the trajectory execution and publish it to other ROS nodes. Each observed HMI is simplistically represented as a spherical collision object.…”
Section: Motion Planning (mentioning)
confidence: 99%
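The distance computation quoted above is simple enough to sketch. The snippet below is only an illustration, not the cited authors' implementation: it assumes each HMI is modelled as a sphere with a hypothetical centre and radius, and computes the collision vector and clearance from one point of the robot's trajectory using the Euclidean norm ‖x − y‖₂ referred to as equation (7).

```python
import numpy as np

def collision_vector(robot_point, hmi_center, hmi_radius):
    """Collision vector from a trajectory point to a spherical HMI model.

    robot_point, hmi_center: 3-element coordinates (hypothetical values).
    hmi_radius: radius of the sphere approximating the HMI.
    Returns a vector pointing from the robot point towards the HMI,
    scaled to the remaining clearance, plus the clearance itself
    (negative clearance means the sphere is penetrated).
    """
    diff = np.asarray(hmi_center, float) - np.asarray(robot_point, float)
    dist = np.linalg.norm(diff)           # Euclidean norm, i.e. ‖x − y‖₂
    clearance = dist - hmi_radius         # distance to the sphere surface
    direction = diff / dist if dist > 0 else np.zeros(3)
    return direction * clearance, clearance

# Example: clearance to two observed HMIs at one moment of the trajectory.
robot = [0.4, 0.1, 0.9]
for center, radius in [([0.7, 0.0, 1.0], 0.15), ([0.2, 0.5, 0.8], 0.10)]:
    vec, clear = collision_vector(robot, center, radius)
    print(f"clearance {clear:.3f} m, vector {np.round(vec, 3)}")
```

In the cited setup this quantity would be recomputed at every step of the trajectory execution and published to other ROS nodes; the sketch above only shows the per-point geometry.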
“…Better user awareness can be provided by utilising notification devices (human-machine interfaces, HMIs) that allow the user to understand the motion plans and status of the robot. In order to convey information, these systems may utilise the primary sensory modalities of a human: vision (monitors [6], light projectors [7,8], mixed reality devices [9][10][11]), hearing (sound alerts and speech notifications [2,12]), and touch (tactile feedback devices [4,13,14,15]). The touch modality represents a robust and direct way of transferring information to the user, making it suitable for conveying information to workers in industrial environments, where the visual and auditory modalities might be busy or blocked.…”
Section: Introduction (mentioning)
confidence: 99%
“…Recent works in this area include approaches using AR for robot design [35], calibration [42], and training [46]. Moreover, there are a number of approaches towards communicating robots' perspectives [20], intentions [2,9,10,12,15], and trajectories [14,31,36,54].…”
Section: Related Work 2.1 AR for HRI (mentioning)
confidence: 99%