2021
DOI: 10.1016/j.ast.2021.107167

Visual-based obstacle detection and tracking, and conflict detection for small UAS sense and avoid

Cited by 32 publications (19 citation statements)
References 50 publications

“…Traditional sensors used for surveillance and tracking include ADS-B, onboard radar, and transponders [19]; however, autonomous aircraft require additional sensors both for redundancy and to replace the visual acquisition typically performed by the pilot. For this reason, vision-based traffic detection systems have been proposed, in which intruding aircraft are detected from images taken by a camera sensor mounted on the aircraft [5], [20]. In this section, we will apply our risk-driven design techniques to improve the safety of a vision-based detect and avoid (DAA) system.…”
Section: Vision-based Detect and Avoid Application (mentioning)
confidence: 99%
“…The design of reliable perception systems is a key challenge in the development of safety-critical autonomous systems [1], [2]. Modern perception systems are often required to predict state information from complex, high-dimensional inputs such as images or LiDAR data [3]- [5]. This information is then passed to a controller, which uses the state estimate to make safety-critical decisions.…”
Section: Introduction (mentioning)
confidence: 99%
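The statement above describes the usual perception-to-control architecture: a perception module turns a high-dimensional input (e.g. a camera frame) into a state estimate, and a controller turns that estimate into a safety-critical decision. The sketch below illustrates only that interface; all names (IntruderState, estimate_intruder_state, avoidance_command) and the thresholds are hypothetical and are not taken from the cited papers.

```python
from dataclasses import dataclass
from typing import Optional

import numpy as np


@dataclass
class IntruderState:
    """State estimate produced by the perception front end (hypothetical)."""
    bearing_rad: float     # horizontal angle to the intruder
    elevation_rad: float   # vertical angle to the intruder
    confidence: float      # detection confidence in [0, 1]


def estimate_intruder_state(image: np.ndarray) -> Optional[IntruderState]:
    """Placeholder perception module: map a camera frame to a state estimate.

    A real system would run a detector and tracker here; this stub only
    illustrates the interface between perception and control.
    """
    return None  # no intruder found by this stub


def avoidance_command(state: Optional[IntruderState],
                      min_confidence: float = 0.5) -> str:
    """Placeholder controller: turn a state estimate into a decision."""
    if state is None or state.confidence < min_confidence:
        return "continue"        # nothing credible detected
    if abs(state.bearing_rad) < np.deg2rad(10.0):
        return "avoid"           # intruder roughly head-on
    return "monitor"             # keep tracking, no maneuver yet


# One dummy frame through the pipeline.
frame = np.zeros((480, 640, 3), dtype=np.uint8)
print(avoidance_command(estimate_intruder_state(frame)))  # -> "continue"
```

In a real detect-and-avoid pipeline the stub perception function would be replaced by a detector and tracker, and the placeholder controller by a validated avoidance logic.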
“…For instance, PP‐YOLO (YOLO based on Paddle) [8] balances detection speed and precision by building on YOLOv3 [9], while YOLOv5 improves on YOLOv4 by allowing different backbone networks to be selected for different missions according to their requirements for detection precision and timeliness. Recent research on UAV detection has mainly targeted UAV collision avoidance and path planning [5, 6], extracting UAV position information with DL‐based target detection algorithms; however, these works did not consider how to pre‐process the video to improve detection accuracy under poor field‐of‐view conditions.…”
Section: Introduction (mentioning)
confidence: 99%
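The gap noted above concerns pre-processing video before detection under poor field-of-view conditions. As a minimal illustration, the sketch below applies OpenCV's CLAHE to the luminance channel of a frame before handing it to any downstream detector; the choice of CLAHE and its parameters are assumptions made here for illustration, not the method of the cited works.

```python
import cv2
import numpy as np


def enhance_frame(frame_bgr: np.ndarray) -> np.ndarray:
    """Boost local contrast with CLAHE on the L channel of the LAB image."""
    lab = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2LAB)
    l_channel, a_channel, b_channel = cv2.split(lab)
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))  # assumed parameters
    l_enhanced = clahe.apply(l_channel)
    merged = cv2.merge((l_enhanced, a_channel, b_channel))
    return cv2.cvtColor(merged, cv2.COLOR_LAB2BGR)


# Example: enhance a dull synthetic frame before passing it to a detector
# (e.g. a YOLO-family model) downstream.
frame = np.full((480, 640, 3), 90, dtype=np.uint8)
enhanced = enhance_frame(frame)
print(enhanced.shape, enhanced.dtype)
```

Whether contrast enhancement of this kind actually improves detection in the conditions described would need to be validated against the specific detector and dataset.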
“…For the safety of surface ships, UAVs encountered at sea must be quickly detected, defended against, and intercepted. UAVs are mainly detected using radar [1], acoustic signatures [2], radio frequency [3,4], and video images [5,6]. Radar-based detection offers long range and good precision but performs poorly at detecting and tracking low-altitude, slow, small objects such as UAVs.…”
Section: Introduction (mentioning)
confidence: 99%