Teams of mobile robots will play a crucial role in future missions to explore the surfaces of extraterrestrial bodies. Setting up infrastructure and taking scientific samples are expensive tasks when operating in distant, challenging, and unknown environments. In contrast to current single-robot space missions, future heterogeneous robotic teams will increase efficiency via enhanced autonomy and parallelization, improve robustness via functional redundancy, and benefit from the complementary capabilities of the individual robots. In this article, we present our heterogeneous robotic team, consisting of flying and driving robots, which we plan to deploy on scientific sampling demonstration missions at a Moon-analogue site on Mt. Etna, Sicily, Italy in 2021 as part of the ARCHES project. We describe the robots' individual capabilities and their roles in two mission scenarios. We then present components and experiments addressing key tasks within these scenarios: automated task planning, high-level mission control, spectral rock analysis, radio-based localization, collaborative multi-robot 6D SLAM in Moon-analogue and Mars-like scenarios, and demonstrations of autonomous sample return.
Planetary rovers increasingly rely on vision-based components for autonomous navigation and mapping. Developing and testing these components requires representative optical conditions, which can be achieved either by field testing at planetary analog sites on Earth or by using prerecorded data sets from such locations. However, the availability of representative data is scarce, and field testing at planetary analog sites requires a substantial financial investment and logistical overhead and entails the risk of damaging complex robotic systems. To address these issues, we use our compact, human-portable DLR Sensor Unit for Planetary Exploration Rovers (SUPER) in the Moroccan desert to demonstrate resource-efficient field testing and make the resulting Morocco-Acquired data set of Mars-Analog eXploration (MADMAX) publicly accessible. The data set consists of 36 navigation experiments, captured at eight Mars analog sites with widely varying environmental conditions. Its longest trajectory covers 1.5 km and the combined trajectory length is 9.2 km. The data set contains time-stamped recordings from monochrome stereo cameras, a color camera, omnidirectional cameras in stereo configuration, and an inertial measurement unit. Additionally, we provide ground-truth position and orientation together with the associated uncertainties, obtained by a real-time kinematic-based algorithm that fuses the global navigation satellite system data of two body antennas. Finally, we run two state-of-the-art navigation algorithms, ORB-SLAM2 and VINS-Mono, on our data to evaluate their accuracy and to provide a baseline, which can serve as a reference for the accuracy and robustness of other navigation algorithms. The data set can be accessed at https://rmc.dlr.de/morocco2018.
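To make the intended use of such a baseline concrete, the sketch below evaluates an estimated trajectory against the RTK ground truth via the absolute trajectory error (ATE), a standard metric for this kind of comparison. The CSV layout (one t, x, y, z row per pose), the file names, and the rigid-alignment step are illustrative assumptions, not part of the MADMAX tooling or format specification.

```python
# Minimal ATE evaluation sketch against RTK ground truth.
# The CSV layout (t, x, y, z per row) and the file names are illustrative only.
import numpy as np

def load_traj(path):
    """Load a trajectory as (timestamps, N x 3 positions)."""
    data = np.loadtxt(path, delimiter=",")
    return data[:, 0], data[:, 1:4]

def associate(t_est, t_gt, max_dt=0.05):
    """Match each estimated timestamp to the nearest ground-truth timestamp."""
    idx = np.clip(np.searchsorted(t_gt, t_est), 1, len(t_gt) - 1)
    prev_closer = np.abs(t_gt[idx - 1] - t_est) < np.abs(t_gt[idx] - t_est)
    idx[prev_closer] -= 1
    ok = np.abs(t_gt[idx] - t_est) <= max_dt
    return np.nonzero(ok)[0], idx[ok]

def align_rigid(p_est, p_gt):
    """Least-squares rigid (rotation + translation) alignment of est to gt."""
    mu_e, mu_g = p_est.mean(0), p_gt.mean(0)
    U, _, Vt = np.linalg.svd((p_gt - mu_g).T @ (p_est - mu_e))
    S = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])
    R = U @ S @ Vt
    return R, mu_g - R @ mu_e

t_est, p_est = load_traj("vslam_estimate.csv")   # hypothetical estimator output
t_gt, p_gt = load_traj("rtk_ground_truth.csv")   # hypothetical ground-truth export
i_est, i_gt = associate(t_est, t_gt)
R, t = align_rigid(p_est[i_est], p_gt[i_gt])
residuals = (p_est[i_est] @ R.T + t) - p_gt[i_gt]
print("ATE RMSE [m]:", np.sqrt((residuals ** 2).sum(axis=1).mean()))
```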
The Earth's Moon is currently an object of interest for many space agencies planning unmanned robotic missions within this decade. Besides future prospects for building lunar gateways in support of human space flight, the Moon is an attractive location for scientific purposes. Not only will its study give insight into the formation of the Solar System, but its location, unaffected by the Earth's ionosphere, also represents a vantage point for observing the Sun and planetary bodies outside the Solar System. Lunar exploration has traditionally been conducted by means of single-agent robotic assets, which limits the scientific return of such missions. The German Aerospace Center (DLR) is developing fundamental technologies towards increased autonomy of robotic explorers to fulfil more complex mission tasks through cooperation. This paper presents an overview of past, present and future activities of DLR towards highly autonomous systems for scientific missions targeting the Moon and other planetary bodies. The heritage of the Mobile Asteroid Scout (MASCOT), developed jointly by DLR and CNES and deployed on asteroid Ryugu on 3 October 2018 from JAXA's Hayabusa2 spacecraft, inspired the development of novel core technologies towards higher efficiency in planetary exploration. Together with the lessons learnt from the ROBEX project (2012–2017), in which a mobile robot autonomously deployed seismic sensors at a Moon-analogue site, this experience is shaping the future steps towards more complex space missions. These include the development of a mobile rover for JAXA's Martian Moons eXploration (MMX) mission in 2024 as well as demonstrations of novel multi-robot technologies at a Moon-analogue site on the volcano Mt Etna in the ARCHES project. Within ARCHES, a demonstration mission is planned from 14 June to 10 July 2021, during which heterogeneous teams of robots will autonomously conduct geological and mineralogical analysis experiments and deploy an array of low-frequency antennas to measure Jovian and solar bursts. This article is part of a discussion meeting issue ‘Astronomy from the Moon: the next decades’.
In this paper, we present a novel algorithm for tracking cells in time-lapse confocal microscopy movies of Drosophila epithelial tissue during pupal morphogenesis. We consider a 2D + time video as a 3D static image, in which the frames are stacked on top of each other, and apply a spatio-temporal segmentation algorithm to obtain spatio-temporal 3D tubes representing the evolution of individual cells. The main idea of the tracking is the use of two distance functions: the first computed from the cells in the initial frame and the second from the segmented boundaries. We track the cells backwards in time. The first distance function attracts the subsequently constructed cell trajectories to the cells in the initial frame, and the second forces them to stay close to the centerlines of the segmented tubular structures. This makes our tracking algorithm robust against noise and missing spatio-temporal boundaries. The approach can be generalized to 3D + time video analysis, where the spatio-temporal tubes become 4D objects.
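The following Python sketch illustrates the two-distance-function idea for a 2D + time stack treated as a 3D volume. The binary masks init_cells and boundaries, the linear weighting of the two distance maps, and the greedy backward descent are simplifying assumptions made for illustration; they are not the authors' numerical scheme.

```python
# Simplified sketch of the two-distance-function tracking idea in 2D + time.
# init_cells: boolean (T, H, W) volume, True only on cell regions in frame 0.
# boundaries: boolean (T, H, W) volume, True on segmented cell boundaries.
# The weights and the greedy descent below are illustrative assumptions.
import numpy as np
from scipy.ndimage import distance_transform_edt

def build_cost(init_cells, boundaries, alpha=1.0, beta=1.0):
    """Combine distance to the initial-frame cells (attraction to frame 0)
    with distance to the segmented boundaries (attraction to centerlines)."""
    d_init = distance_transform_edt(~init_cells)    # small near frame-0 cells
    d_bound = distance_transform_edt(~boundaries)   # large at tube centerlines
    return alpha * d_init - beta * d_bound          # minimized along good tracks

def track_backwards(cost, start_yx, start_t, radius=3):
    """Greedy backward tracking: step one frame back at a time, moving to the
    lowest-cost pixel inside a small window around the current position."""
    T, H, W = cost.shape
    y, x = start_yx
    path = [(start_t, y, x)]
    for t in range(start_t - 1, -1, -1):
        y0, y1 = max(0, y - radius), min(H, y + radius + 1)
        x0, x1 = max(0, x - radius), min(W, x + radius + 1)
        window = cost[t, y0:y1, x0:x1]
        dy, dx = np.unravel_index(np.argmin(window), window.shape)
        y, x = y0 + dy, x0 + dx
        path.append((t, y, x))
    return path  # cell trajectory from the last frame back to frame 0

# Example use (the masks come from the segmentation step, which is not shown):
# cost = build_cost(init_cells, boundaries)
# track = track_backwards(cost, start_yx=(120, 85), start_t=cost.shape[0] - 1)
```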