2019 IEEE Aerospace Conference
DOI: 10.1109/aero.2019.8741885

Continued Advances in Supervised Autonomy User Interface Design for METERON SUPVIS Justin

Abstract: The exploration of the universe remains a challenging endeavor, constantly pushing the limits of technology. Of special interest is the investigation of the other planets of our solar system, such as Mars, which has been examined by various teleoperated and (semi-)autonomous satellites and landers. But an important milestone needed for a deeper understanding of the planet is still missing: a crewed landing. In order to send humans to such a remote location, an infrastructure for the landing crew includ…

Cited by 14 publications (10 citation statements). References 20 publications.
“…This was especially observed in several space telerobotics experiments. In Kontur-2 and METERON SUPVIS Justin, both carried out from the International Space Station (ISS) to ground, DLR’s dexterous humanoid robot Justin (Fuchs et al., 2009) was commanded using a 2-DOF force-reflection joystick (Artigas et al., 2016) and a task-driven, supervised-autonomy-based GUI (Schmaus et al., 2019), respectively, to perform a variety of dexterous robotic tasks. Although the ISS crew members in both experiments were able to successfully complete their given tasks, they expressed the desire for different UI modalities to be available for more effective teleoperation (Lii et al., 2018).…”
Section: Related Work
confidence: 99%
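For readers unfamiliar with the command style contrasted in the quote above, the following Python sketch illustrates the general idea of task-driven supervised autonomy: the operator issues a single compact, object-level command and the robot-side autonomy carries it out, instead of the operator closing a continuous joystick control loop. All class, task, and object names below are hypothetical illustrations and are not taken from the SUPVIS Justin software.

# A minimal sketch of a task-level (supervised autonomy) command interface,
# in contrast to continuous joystick teleoperation. All names here are
# hypothetical and do not come from the SUPVIS Justin software.
from dataclasses import dataclass
from enum import Enum, auto


class Task(Enum):
    NAVIGATE_TO_PANEL = auto()
    OPEN_PANEL_COVER = auto()
    PLUG_CONNECTOR = auto()


@dataclass
class TaskCommand:
    task: Task
    target_id: str  # identifier of the object the robot should act on


class RobotSupervisor:
    """Robot-side autonomy that executes one compact command end to end."""

    def execute(self, cmd: TaskCommand) -> str:
        # A real system would run perception, planning, and local control
        # here; this sketch only reports the commanded intent.
        return f"executing {cmd.task.name} on {cmd.target_id}"


# The operator-side GUI sends single, compact commands such as:
supervisor = RobotSupervisor()
print(supervisor.execute(TaskCommand(Task.PLUG_CONNECTOR, "panel-2")))

Because only the command and a status message cross the space link, this style of interface is far less sensitive to the round-trip delays and bandwidth limits of orbit-to-ground communication than continuous joystick control.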
“…Thus, a direct remote control of the rover, e.g. by joystick from a ground station as in [26], is not possible. Also, a live stream of large amounts of data, like stereo images, depth images, or the obstacle map, is impossible.…”
Section: Navigation System Architecture
confidence: 99%
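The bandwidth argument quoted above can be made concrete with a rough back-of-the-envelope comparison; the image resolution, bit depth, and update rates below are illustrative assumptions, not values from the cited rover system.

# Rough, illustrative comparison of downlink volume: streaming raw depth
# images versus sending only a compact pose/waypoint update. All numbers
# are assumptions chosen for illustration, not mission parameters.
import struct

# One 640 x 480 depth image with 16-bit pixels, sent at 10 Hz:
depth_bytes_per_s = 640 * 480 * 2 * 10          # about 6.1 MB/s

# Compact telemetry instead: pose (x, y, heading) plus four 2-D waypoints,
# packed as 32-bit floats and sent at 1 Hz:
packet = struct.pack("<11f", *([0.0] * 11))     # 3 + 4 * 2 = 11 floats
telemetry_bytes_per_s = len(packet)             # 44 bytes/s

print(f"raw depth stream:  {depth_bytes_per_s / 1e6:.1f} MB/s")
print(f"compact telemetry: {telemetry_bytes_per_s} B/s")

The several-orders-of-magnitude gap illustrates why such navigation architectures keep perception and mapping on board and exchange only compact, high-level data over the space link.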
“…4) Human-Robot Interface: Recent works on haptic teleoperation systems for space missions have emphasised the need for intuitive, transparent, and safe control of remote systems [16], [17]. Aside from the control design aspects of a teleoperated system, these works consider enhanced feedback to operators through augmented display overlays.…”
Section: B Related Work
confidence: 99%
“…In addition to feedback systems, operator monitoring systems have been proposed as part of the Kontur-3 system [17], involving the use of head-mounted displays (HMDs) with gaze-tracking features for evaluating operator cognitive load in ergonomics studies. A challenge with operator feedback systems is the limited space available in an in-orbit vessel, which can lead to unintuitive interfaces as designers attempt to fit interfaces to smaller displays [16]. A potential solution is to combine the mental load estimation capabilities of gaze-tracking HMDs with augmented reality interfaces to provide significantly greater flexibility in interface design for these systems, as explored in Section III-B.…”
Section: B Related Work
confidence: 99%