2021 29th Conference of Open Innovations Association (FRUCT)
DOI: 10.23919/fruct52173.2021.9435520
Computer Vision System for Landing Platform State Assessment Onboard of Unmanned Aerial Vehicle in Case of Input Visual Information Distortion

Cited by 1 publication (1 citation statement)
References 13 publications
“…Nevertheless, this technology is not always available, or is sometimes incapable of providing an acceptable level of accuracy, as can happen when the inside of a tank at a petrochemical plant is to be inspected. For this reason, complementary landing assistance systems (LASs) have been proposed in the literature based on computer vision techniques [9,10,11,12,13,14,15,16,17,18,19], a fusion of computer vision techniques and inertial measurement units (IMUs) [20,21,22,23,24,25], computer vision, IMU, and ultrasonic sensors [26], computer vision and a Time-of-Flight-based height sensor [27], computer vision and GNSS [28,29], and even an approach fusing onboard cameras with a robotic total station [30]. The main setback of traditional vision-based systems is their strong dependency on weather and lighting conditions.…”
Section: Introduction
confidence: 99%