2018
DOI: 10.11591/eecsi.v5.1681

A Relative Rotation between Two Overlapping UAV's Images

Abstract: In this paper, we study the influence of varying baseline components on the accuracy of the relative rotation between two overlapping aerial images taken from an unmanned aerial vehicle (UAV) flight. The case is relevant when mosaicking UAV aerial images by registering each individual image. Geotagged images, facilitated by a navigation-grade GPS receiver on board, provide the camera position at the moment of exposure. However, the low-accuracy geographic coordinates encoded in the EXIF format are unreliable to…

Cited by 5 publications (9 citation statements) | References 22 publications
“…Here i and j represent information pertaining to the i-th image out of the m images [17,19] and relative orientation methods [20,21], whereas the spatial positions of the object space points are calculated by space intersection [22]. The generation of initial approximate values for all parameters is required in (1) because the photogrammetric observation equations are non-linear with respect to the parameters and must be iteratively solved using a linearized set of equations.…”
Section: Methods
confidence: 99%
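
The cited passage describes the standard iterative solution of linearized observation equations. Below is a minimal Gauss-Newton sketch in Python; the model function, starting values, and numerical Jacobian are illustrative assumptions, not the paper's actual photogrammetric system, where the Jacobian would instead hold the analytic partial derivatives of the collinearity equations.

import numpy as np

def gauss_newton(f, x0, observations, n_iter=10, eps=1e-10):
    # Solve min ||observations - f(x)||^2 by repeated linearization.
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        r = observations - f(x)            # residuals at the current estimate
        J = np.empty((r.size, x.size))     # numerical Jacobian (forward differences)
        h = 1e-6
        for j in range(x.size):
            xp = x.copy()
            xp[j] += h
            J[:, j] = (f(xp) - f(x)) / h
        # Normal equations of the linearized system: J^T J dx = J^T r.
        dx = np.linalg.solve(J.T @ J, J.T @ r)
        x = x + dx
        if np.linalg.norm(dx) < eps:       # stop when the correction is negligible
            break
    return x

# Example: fit y = a * exp(b * t) to samples; a and b are the unknowns.
t = np.linspace(0.0, 1.0, 20)
obs = 2.0 * np.exp(-1.5 * t)
model = lambda p: p[0] * np.exp(p[1] * t)
print(gauss_newton(model, [1.0, -1.0], obs))   # converges near [2.0, -1.5]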
“…The self-calibration method is performed by means of bundle adjustment [6], which provides a simultaneous determination of the interior and exterior orientation parameters as well as the object point coordinates, with or without known coordinates of control points [11], [13], [15], [21]–[23]. However, when object space control points with known 3D coordinates are available, calibration parameters can be recovered using space resection methods [24]–[26], relative orientation methods [27], [28], or bundle adjustment with fixed 3D coordinates. Before elaborating on the results, a brief discussion of the self-calibration method follows.…”
Section: Introduction
confidence: 99%
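
Both bundle adjustment and space resection, as listed in this statement, minimize residuals against the collinearity condition. The following is a hedged sketch of that projection; the parameter names (f, x0, y0, Xc, R) and sign convention are illustrative assumptions, as conventions differ between texts.

import numpy as np

def collinearity_project(X, Xc, R, f, x0=0.0, y0=0.0):
    # Camera-frame coordinates of the object point: rotate the
    # object-space offset from the perspective center Xc into the image frame.
    u, v, w = R @ (np.asarray(X, dtype=float) - np.asarray(Xc, dtype=float))
    # Collinearity equations: perspective projection with principal
    # distance f and principal point (x0, y0).
    return x0 - f * u / w, y0 - f * v / w

# A point roughly 100 m in front of a camera at the origin, identity rotation.
print(collinearity_project([10.0, 5.0, -100.0], [0.0, 0.0, 0.0], np.eye(3), f=0.05))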
“…Other parameters that must also be known are the EO parameters of each image and the pixel intervals. The EO parameters, consisting of the camera position at the moment of exposure, i.e., its perspective center (X₀, Y₀, Z₀), and a rotation matrix R composed of the ω, φ, κ rotation angles, are determined in a bundle adjustment computation or using other robust methods such as single-image resection [31]–[33] and relative orientation [34,35]. Furthermore, the pixel spacing of the digital camera in metric units, the DSM cell size in ground units, and the bounding-box rectangle of the DSM pixels in a map projection grid must be determined in advance.…”
Section: Introduction
confidence: 99%
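
As a companion to the EO parameters described above, the sketch below builds R from the ω, φ, κ angles under one common omega-phi-kappa ordering, R = R_z(κ) · R_y(φ) · R_x(ω); treat the axis ordering as an assumption, since photogrammetric packages differ on it.

import numpy as np

def rotation_opk(omega, phi, kappa):
    co, so = np.cos(omega), np.sin(omega)
    cp, sp = np.cos(phi), np.sin(phi)
    ck, sk = np.cos(kappa), np.sin(kappa)
    Rx = np.array([[1.0, 0.0, 0.0],
                   [0.0,  co, -so],
                   [0.0,  so,  co]])   # omega: rotation about the X axis
    Ry = np.array([[ cp, 0.0,  sp],
                   [0.0, 1.0, 0.0],
                   [-sp, 0.0,  cp]])   # phi: rotation about the Y axis
    Rz = np.array([[ ck, -sk, 0.0],
                   [ sk,  ck, 0.0],
                   [0.0, 0.0, 1.0]])   # kappa: rotation about the Z axis
    return Rz @ Ry @ Rx

# A pure 90-degree kappa rotation swaps the image X and Y axes.
print(rotation_opk(0.0, 0.0, np.pi / 2).round(3))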