Visual model‐predictive localization for computationally efficient autonomous racing of a 72‐g drone
2020 · DOI: 10.1002/rob.21956

Abstract: Drone racing is becoming a popular e-sport all over the world, and beating the best human drone race pilots has quickly become a new major challenge for artificial intelligence and robotics. In this paper, we propose a novel sensor fusion method called visual model-predictive localization (VML). Within a small time window, VML approximates the error between the model-predicted position and the visual measurements as a linear function. Once the parameters of the function are estimated by the RANSAC algorithm, …
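The abstract sketches the core of VML: over a short time window, the offset between the model-predicted position and the (delayed, outlier-prone) visual measurements is fitted as a linear function of time with RANSAC, and that fitted error is then evaluated at the current time to correct the prediction. A minimal one-dimensional sketch of this idea follows; it is not the paper's implementation, and the function names, window handling, and thresholds are illustrative assumptions:

```python
import numpy as np

def ransac_line_fit(t, e, iters=200, thresh=0.05, rng=None):
    """Fit e ≈ a*t + b with RANSAC to reject outlier measurements.

    Repeatedly fits a line through two random samples, keeps the
    hypothesis with the most inliers, then refines it by least squares.
    """
    rng = rng if rng is not None else np.random.default_rng(0)
    best_inliers = None
    for _ in range(iters):
        i, j = rng.choice(len(t), size=2, replace=False)
        if t[i] == t[j]:
            continue
        a = (e[j] - e[i]) / (t[j] - t[i])
        b = e[i] - a * t[i]
        inliers = np.abs(e - (a * t + b)) < thresh
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    # Refine the line on the inlier set only.
    a, b = np.polyfit(t[best_inliers], e[best_inliers], 1)
    return a, b

def vml_correct(t_now, pred_pos, t_meas, meas_pos, pred_at_meas):
    """Correct the model-predicted position at time t_now.

    The prediction error at each (possibly delayed) measurement time is
    modeled as a linear function of time; extrapolating that line to
    t_now compensates for the visual-pipeline latency.
    """
    errors = meas_pos - pred_at_meas      # error at each measurement time
    a, b = ransac_line_fit(t_meas, errors)
    return pred_pos + (a * t_now + b)     # extrapolate error to the present
```

Because the error model is fitted only inside a short window and extrapolated forward, delayed visual fixes can still correct the present state, while RANSAC discards spurious detections that would skew a plain least-squares or Kalman update.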

Cited by 40 publications (29 citation statements). References 31 publications.
“…Classical proportional controllers were also used in the work [7]. In this case, they were arranged in a cascade scheme.…”
Section: B. Vision-based Drone Control
confidence: 99%
“…As an alternative to Kalman filtering, Li et al [33] propose a novel sensor fusion method called Visual Model-Predictive Localization (VML) that incorporates compensation of the delay in the visual information. Experimental results obtained with the VML show a good ability to deal with outliers, an advantage over a simple Kalman filtering.…”
Section: Perception, Localization and Filtering
confidence: 99%
“…Besides that, specific autonomous navigation strategies for a racing drone are also discussed in the literature. Recent papers [24,36], and some previously commented [5,12,33] address methods of localization, control and planning for autonomous navigation in racing environments without obstacles.…”
Section: Solutions For Drone Racing
confidence: 99%
“…The runner-up team used a depth camera on board the drone to generate a local point cloud; the latter was aligned against a 3D CAD model of the race track, thus resolving the drone's localization; waypoints were located some distance before and after the gate for the drone to navigate towards them and to cross them; it must be said that this team achieved the fastest flight in terms of meters per second [1]. Opposite to the use of SLAM, one of the teams proposed a state machine where each state represented a flight behavior [15,16]. The behaviors were defined for different sections in the racetrack.…”
Section: Related Work
confidence: 99%