2020
DOI: 10.1126/scirobotics.aaz9712

Dynamic obstacle avoidance for quadrotors with event cameras

Abstract: Today’s autonomous drones have reaction times of tens of milliseconds, which is not enough for navigating fast in complex dynamic environments. To safely avoid fast moving objects, drones need low-latency sensors and algorithms. We departed from state-of-the-art approaches by using event cameras, which are bioinspired sensors with reaction times of microseconds. Our approach exploits the temporal information contained in the event stream to distinguish between static and dynamic objects and leverages a fast st…
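
The core idea in the abstract, separating moving objects from the static background within a short event window, is commonly implemented by compensating the camera's own rotation (e.g. from IMU rates) and then thresholding a per-pixel mean-timestamp image. The sketch below is a minimal illustration of that general pipeline, not the authors' code: the small-angle rotation model, the intrinsics `K`, and the 0.6 threshold are all assumptions.

```python
import numpy as np

def egomotion_compensate(events, omega, K, K_inv, t_ref):
    """Warp event pixels back to a reference time under a constant angular
    velocity (e.g. IMU body rates): a rotation-only, small-angle sketch."""
    # events: (N, 3) array of (x, y, t); omega: (3,) body rates in rad/s
    xy1 = np.column_stack([events[:, 0], events[:, 1], np.ones(len(events))])
    rays = (K_inv @ xy1.T).T                      # back-project pixels to bearing rays
    dt = events[:, 2] - t_ref
    wx, wy, wz = omega
    omega_hat = np.array([[0, -wz, wy], [wz, 0, -wx], [-wy, wx, 0]])
    # first-order rotation: R(-omega*dt) ~ I - [omega]_x * dt
    rays_corr = rays - (rays @ omega_hat.T) * dt[:, None]
    pix = (K @ rays_corr.T).T
    return pix[:, :2] / pix[:, 2:3]               # re-project to the image plane

def dynamic_mask(events, warped_xy, shape, thresh=0.6):
    """Mean-timestamp image: pixels whose (compensated) events cluster late in
    the window are flagged as likely moving objects."""
    mean_t = np.zeros(shape)
    count = np.zeros(shape)
    t = events[:, 2]
    t_norm = (t - t.min()) / max(t.max() - t.min(), 1e-9)
    xs = np.clip(warped_xy[:, 0].astype(int), 0, shape[1] - 1)
    ys = np.clip(warped_xy[:, 1].astype(int), 0, shape[0] - 1)
    np.add.at(mean_t, (ys, xs), t_norm)           # accumulate normalized timestamps
    np.add.at(count, (ys, xs), 1.0)               # count events per pixel
    mean_t = np.where(count > 0, mean_t / np.maximum(count, 1.0), 0.0)
    return mean_t > thresh                        # True where a dynamic object is likely
```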

Cited by 227 publications (156 citation statements)
References 43 publications
“…Some methods use sensors in a passive way, such as event-based cameras [17], RGB-D cameras, and RGB cameras. These sensors are light in weight and rich in information.…”
Section: Methods For Understanding the Environment
mentioning, confidence: 99%
“…These sensors are light in weight and rich in information. Falanga et al [17] used event-based cameras to achieve outdoor dynamic obstacle avoidance capability but the cost of the event-based camera is too high and it could not recognize generic items. Using RGB-D camera, researchers could generate a global map and compute the location easily, but these visual SLAM and VO methods require powerful computational capability, which is not suitable for UAVs with limited resources.…”
Section: Methods For Understanding the Environment
mentioning, confidence: 99%
“…Then, a controller guided the UAS towards a safe direction opposite to the motion of the incoming obstacle. Recently, the work in [15] employed batches of events collected each 10 ms to perform ego-motion compensation for obstacle detection using a stereo event camera setup mounted in a quadrotor. The UAS performed a reactive avoidance strategy based on potential fields describing obstacles as geometric primitives.…”
Section: State Of the Art
mentioning, confidence: 99%
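
The statement above mentions a reactive avoidance strategy based on potential fields with obstacles modeled as geometric primitives. Below is a minimal, hypothetical sketch of such a command law, assuming spherical primitives and placeholder gains; it is not the implementation from [15].

```python
import numpy as np

def potential_field_velocity(obstacles, drone_pos, goal, k_att=1.0, k_rep=2.0, d_safe=1.5):
    """Attractive term toward the goal plus a repulsive term for every obstacle
    closer than d_safe; obstacles are (center, radius) spherical primitives."""
    v = k_att * (goal - drone_pos)                      # attraction toward the goal
    for center, radius in obstacles:
        diff = drone_pos - center
        d = max(np.linalg.norm(diff) - radius, 1e-6)    # distance to the primitive surface
        if d < d_safe:
            # classic repulsive-potential gradient, pushing away from the obstacle
            v += k_rep * (1.0 / d - 1.0 / d_safe) / d**2 * (diff / np.linalg.norm(diff))
    return v

# Example: a ball approaching head-on while the drone flies toward a waypoint
cmd = potential_field_velocity(
    obstacles=[(np.array([1.0, 0.0, 1.2]), 0.1)],
    drone_pos=np.array([0.0, 0.0, 1.2]),
    goal=np.array([3.0, 0.0, 1.2]),
)
```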
“…The performance of some of the previous event-based methods, such as [18], [21], and [15], depend on finetuning their algorithm parameters. Meta-heuristic optimization strategies such as Simulated Annealing (SA) have been used for parameter tuning in computer vision and robotic applications, such as image segmentation [26], motion blur removal [27], feature selection [28], and robot path planning [29].…”
Section: State Of the Art
mentioning, confidence: 99%
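
For readers unfamiliar with Simulated Annealing as a parameter-tuning tool, the following is a generic, self-contained sketch; the step size, cooling schedule, and toy cost function are illustrative and not taken from any of the cited works.

```python
import math
import random

def simulated_annealing(cost, x0, step=0.1, t0=1.0, cooling=0.95, iters=200):
    """Tune a single scalar parameter by minimizing a user-supplied validation
    cost; accepts worse candidates with a temperature-dependent probability."""
    x = x0
    c = cost(x)
    best, c_best = x, c
    temp = t0
    for _ in range(iters):
        cand = x + random.uniform(-step, step)          # perturb: random neighbour
        c_cand = cost(cand)
        # always accept improvements; accept worse moves with Boltzmann probability
        if c_cand < c or random.random() < math.exp(-(c_cand - c) / max(temp, 1e-9)):
            x, c = cand, c_cand
            if c < c_best:
                best, c_best = x, c
        temp *= cooling                                 # geometric cooling schedule
    return best

# Toy usage: recover the parameter value minimizing a quadratic "validation error"
best_param = simulated_annealing(lambda t: (t - 0.62) ** 2, x0=0.0)
print(round(best_param, 2))
```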
“…Biomimicry is derived from the words, bios (Greek) or life or nature, and mimesis (Greek) or imitation (Baumeister, 2012; Hargroves and Smith, 2006). Various industries have adopted biomimicry-based approaches for innovative solutions (Chauhan; Falanga et al., 2020; Hu et al., 2019; Wood, 2019). In medicine, biomimicry involves developing analogs of host endogenous molecules that have evolutionarily adapted to target a given receptor and induce a favorable outcome.…”
Section: Significance
mentioning, confidence: 99%