2016
DOI: 10.1002/rob.21681
Director: A User Interface Designed for Robot Operation with Shared Autonomy

Abstract: Operating a high degree of freedom mobile manipulator, such as a humanoid, in a field scenario requires constant situational awareness, capable perception modules, and effective mechanisms for interactive motion planning and control. A well-designed operator interface presents the operator with enough context to quickly carry out a mission and the flexibility to handle unforeseen operating scenarios robustly. By contrast, an unintuitive user interface can increase the risk of catastrophic operator error by ove…

Cited by 42 publications (30 citation statements) · References 35 publications
“…Most of the existing approaches have achieved this goal by relying on teleoperation [16–22]. Supervisory steering and gas commands are sent to the robot to drive the car in Karumanchi et al. [23]; DeDonato and colleagues [24] propose a hybrid solution, with teleoperated steering and autonomous speed control. The velocity of the car, estimated with stereo cameras, is fed back to a proportional-integral (PI) controller, whereas light detection and ranging (LIDAR), IMU, and visual odometry data support the operator during the steering procedures.…”
Section: Problem Formulation and Proposed Approach (citation type: mentioning)
Confidence: 99%
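The feedback scheme quoted above (vehicle speed estimated from stereo vision, regulated by a PI controller acting on the gas pedal) can be sketched as follows. This is a minimal illustration, not the cited implementation: the gains, control rate, saturation limits, and the toy vehicle model are all assumptions.

```python
# Minimal PI speed-controller sketch: regulates vehicle speed by adjusting a
# normalized gas-pedal command from an external speed estimate (e.g., stereo
# vision). All constants here are illustrative assumptions.

KP, KI = 0.8, 0.2   # proportional and integral gains (assumed)
DT = 0.05           # control period in seconds, i.e., 20 Hz (assumed)

class PISpeedController:
    def __init__(self, kp=KP, ki=KI, dt=DT, u_min=0.0, u_max=1.0):
        self.kp, self.ki, self.dt = kp, ki, dt
        self.u_min, self.u_max = u_min, u_max
        self.integral = 0.0

    def step(self, target_speed, measured_speed):
        """Return a pedal command clipped to [u_min, u_max]."""
        error = target_speed - measured_speed
        self.integral += error * self.dt
        u = self.kp * error + self.ki * self.integral
        # Saturate with simple anti-windup: undo integration while clipped.
        if u > self.u_max:
            self.integral -= error * self.dt
            u = self.u_max
        elif u < self.u_min:
            self.integral -= error * self.dt
            u = self.u_min
        return u

# Usage: drive a crude first-order vehicle model toward 3 m/s.
ctrl = PISpeedController()
speed = 0.0
for _ in range(400):
    pedal = ctrl.step(3.0, speed)
    speed += (5.0 * pedal - 0.5 * speed) * DT   # toy vehicle dynamics
```

The anti-windup clamp matters whenever the pedal saturates at the start of a maneuver; without it, the accumulated integral would cause a large speed overshoot once the error shrinks.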
“…Note, in fact, that even if the robot ankle moves in a small range, the car speed changes significantly. The noise on the ankle command, as well as the initial peak, is due to the derivative term of the gas pedal control (22). However, the signal is smoothed by the task-based QP control (see the dashed black line, i.e., the signal reconstructed from encoder readings), preventing jerky motion of the robot foot.…”
Section: First Experiment: Autonomous Car Driving (citation type: mentioning)
Confidence: 99%
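The effect described above, where a derivative term amplifies sensor noise that a downstream controller then smooths out, can be illustrated with a simple first-order low-pass filter standing in for the task-based QP smoothing. The gains, noise level, and filter constant are illustrative assumptions, not values from the cited work.

```python
# Sketch: a finite-difference derivative term amplifies measurement noise,
# and a downstream first-order low-pass filter (a stand-in for the QP-based
# smoothing described above) attenuates it. Constants are assumptions.
import math
import random

random.seed(0)
DT = 0.01      # sample period (assumed)
KD = 0.5       # derivative gain (assumed)
ALPHA = 0.05   # exponential-smoothing factor (assumed)

prev_meas = 0.0
smoothed = 0.0
raw_cmds, smooth_cmds = [], []
for k in range(1000):
    t = k * DT
    meas = math.sin(t) + random.gauss(0.0, 0.01)  # signal + sensor noise
    d_term = KD * (meas - prev_meas) / DT          # noisy derivative action
    prev_meas = meas
    smoothed += ALPHA * (d_term - smoothed)        # first-order low-pass
    raw_cmds.append(d_term)
    smooth_cmds.append(smoothed)

def stddev(xs):
    m = sum(xs) / len(xs)
    return (sum((x - m) ** 2 for x in xs) / len(xs)) ** 0.5
```

Dividing a noise increment by the small step DT is what makes the raw derivative command jittery; the filter trades a little phase lag for a much smaller command variance, which is the same trade the quoted passage attributes to the QP layer.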
“…The DIRECTOR interface and system architecture used to control the ATLAS robot, developed by Team MIT, are described in [8] and [9], respectively. The system features a shared autonomy scheme backed by interactive, assisted perception and trajectory-optimization-based motion planning [10], where task sequences can be created from high-level motion primitives and constraints.…”
Section: Background (citation type: mentioning)
Confidence: 99%
“…Our shared autonomy builds on the task execution system described in [8], which fills a sequence of task primitives with details acquired online through operator-assisted perception. The operator can review, pause, and amend the execution at any time, and the autonomous mode can be resumed immediately after a phase of manual operation.…”
Section: Shared Autonomy (citation type: mentioning)
Confidence: 99%
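The execution model in the quote above, where a queue of task primitives runs autonomously while the operator may pause, amend the remaining plan, or take over and then hand control back, can be sketched as a small executor. The class and method names are hypothetical, chosen only to mirror the verbs in the passage, not the cited system's API.

```python
# Hypothetical task-sequence executor in the spirit of the shared-autonomy
# scheme quoted above: autonomous stepping through task primitives, with
# operator pause / amend / resume at any point. Names are illustrative.

class TaskExecutor:
    def __init__(self, tasks):
        self.tasks = list(tasks)   # ordered (name, action) task primitives
        self.index = 0             # index of the next task to run
        self.paused = False
        self.log = []              # names of tasks executed so far

    def pause(self):
        """Operator interrupts autonomous execution (e.g., to teleoperate)."""
        self.paused = True

    def resume(self):
        """Autonomous mode resumes exactly where it left off."""
        self.paused = False

    def amend(self, position, task):
        """Operator inserts a new primitive into the remaining plan."""
        self.tasks.insert(position, task)

    def step(self):
        """Run the next task unless paused or done; return True if one ran."""
        if self.paused or self.index >= len(self.tasks):
            return False
        name, action = self.tasks[self.index]
        self.log.append(name)
        action()
        self.index += 1
        return True

# Usage: execute, pause for manual operation, amend the plan, then resume.
noop = lambda: None
ex = TaskExecutor([("walk_to_valve", noop), ("grasp", noop), ("turn", noop)])
ex.step()                                 # runs "walk_to_valve"
ex.pause()                                # operator takes over manually
ex.amend(ex.index, ("regrasp", noop))     # operator inserts a recovery step
ex.resume()
while ex.step():                          # autonomy finishes the plan
    pass
# ex.log is now ["walk_to_valve", "regrasp", "grasp", "turn"]
```

Keeping the cursor (`index`) separate from the task list is what lets an amendment land in the remaining plan without disturbing the already-executed prefix, which matches the resume-after-manual-operation behavior described in the quote.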