Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology 2019
DOI: 10.1145/3332165.3347902

GhostAR: A Time-space Editor for Embodied Authoring of Human-Robot Collaborative Task with Augmented Reality

Cited by 61 publications (46 citation statements)
References 30 publications
“…A spatial programming-by-demonstration (PBD) system called GhostAR was developed in Cao et al (2019), which captures the real-time motion of the human, feeds it to a dynamic time warping (DTW) algorithm that maps it to an authored human motion, and outputs corresponding robot actions in a human-led, robot-assist scenario. The captured human motions and the corresponding robot actions are saved and visualized to the user, who can observe the complete demonstration as saved AR ghosts of both the human and the robot and interactively edit the robot actions to clarify user intent.…”
Section: Discussion
confidence: 99%
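To make the mapping step in the statement above concrete, here is a minimal sketch of DTW matching in Python with NumPy. The names (dtw_distance, match_authored_motion) and the representation of a motion as an (N, 3) array of controller positions are illustrative assumptions, not taken from the GhostAR implementation, which is described only at the level of the citation statement.

    import numpy as np

    def dtw_distance(seq_a, seq_b):
        # Classic dynamic time warping cost between two motion trajectories,
        # each assumed to be an (N, 3) array of hand/controller positions.
        n, m = len(seq_a), len(seq_b)
        cost = np.full((n + 1, m + 1), np.inf)
        cost[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                d = np.linalg.norm(seq_a[i - 1] - seq_b[j - 1])
                cost[i, j] = d + min(cost[i - 1, j],      # advance seq_a only
                                     cost[i, j - 1],      # advance seq_b only
                                     cost[i - 1, j - 1])  # advance both
        return cost[n, m]

    def match_authored_motion(captured, authored_motions):
        # Return the index of the authored motion closest (under DTW) to the
        # captured motion, so the caller can look up its paired robot action.
        distances = [dtw_distance(captured, ref) for ref in authored_motions]
        return int(np.argmin(distances))

    # Hypothetical usage: three authored motions, one live capture.
    authored = [np.random.rand(40, 3) for _ in range(3)]
    live = np.random.rand(55, 3)
    print("closest authored motion:", match_authored_motion(live, authored))

In such a scheme, the returned index would key into a table of robot actions authored alongside each human motion, so that recognizing a motion triggers its paired robot behavior.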
“…In Cao et al (2019), the human motion is captured through the AR devices (an Oculus Rift headset and two Oculus Touch controllers) and saved as ghost holograms. Dynamic time warping is used to infer the human motion in real time from a previously compiled list of groups that represent authored human motions.…”
Section: Discussion
confidence: 99%
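For the real-time inference described above, one plausible (purely illustrative) arrangement is a sliding window over the incoming controller positions that is repeatedly compared, via the dtw_distance helper sketched earlier, against a compiled library of authored motion groups; the class and parameter names below are assumptions, not GhostAR's API.

    import numpy as np
    from collections import deque
    # Reuses dtw_distance(seq_a, seq_b) as defined in the earlier sketch.

    class OnlineMotionMatcher:
        # Illustrative real-time matcher: keep a sliding window of recent
        # positions and, once the window is full, return the name of the
        # authored motion group with the smallest DTW distance to it.
        def __init__(self, motion_groups, window_size=90):  # e.g. ~3 s at 30 Hz
            self.groups = motion_groups        # {name: list of (N, 3) arrays}
            self.window = deque(maxlen=window_size)

        def update(self, position):
            self.window.append(np.asarray(position))
            if len(self.window) < self.window.maxlen:
                return None                    # not enough samples yet
            captured = np.stack(self.window)
            best_name, best_dist = None, np.inf
            for name, references in self.groups.items():
                dist = min(dtw_distance(captured, ref) for ref in references)
                if dist < best_dist:
                    best_name, best_dist = name, dist
            return best_name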
“…Buildings recede as you walk 'back' through time, forests wind back fire damage, glaciers melt or rise as carbon-breathing humanity approaches, and so on. Beyond thought experiments, nascent experiments in AR interfaces of time exist that explore 'dynamic time warping' (DTW) (Cao et al, 2019), combining human interaction with repeatable robot motion to achieve 'adaptive collaboration'.…”
Section: Four-dimensional Place(ment)
confidence: 99%
“…When the desired requirements are achieved, the authoring process is finished and the robot is ready to be deployed to interact autonomously. Authoring differs from classic programming in its focus on end users with limited background in computer science, and it seeks to address how these users can design, or author, behaviors using modalities such as tangible interactions (Sefidgar et al, 2017; Huang and Cakmak, 2017), natural language (Walker et al, 2019), augmented or mixed reality (Cao et al, 2019a; Peng et al, 2018; Akan et al, 2011; Gao and Huang, 2019), visual programming environments (Glas et al, 2016; Paxton et al, 2017), or a mixture of modalities (Huang and Cakmak, 2017; Porfirio et al, 2019). Steinmetz et al (2018) describe task-level programming as parameterizing and sequencing predefined skills composed of primitives to solve a task at hand.…”
Section: Related Work
confidence: 99%