2016
DOI: 10.3390/s16040536
An Alignment Method for the Integration of Underwater 3D Data Captured by a Stereovision System and an Acoustic Camera

Abstract: The integration of underwater 3D data captured by acoustic and optical systems is a promising technique in various applications such as mapping or vehicle navigation. It allows for compensating the drawbacks of the low resolution of acoustic sensors and the limitations of optical sensors in bad visibility conditions. Aligning these data is a challenging problem, as it is hard to make a point-to-point correspondence. This paper presents a multi-sensor registration for the automatic integration of 3D data acquir…
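The abstract describes registering optical (stereo) and acoustic 3D point sets without point-to-point correspondence. As a minimal illustration of the rigid-alignment step common to such registration pipelines (this is the generic Kabsch/Procrustes solution, not the paper's specific multi-sensor method, and it assumes correspondences are already known), one can compute the least-squares rotation and translation between two matched point clouds:

```python
import numpy as np

def rigid_align(src, dst):
    """Kabsch algorithm: find rotation R and translation t that
    minimize sum ||R @ src_i + t - dst_i||^2 over matched points.
    src, dst: (N, 3) arrays of corresponding 3D points."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    # Cross-covariance of the centered point sets
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    # Sign correction guards against a reflection solution
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_dst - R @ c_src
    return R, t
```

In a real optical/acoustic setting the correspondences are unknown, so this closed-form step would sit inside an iterative scheme (e.g., ICP-style nearest-neighbor matching) rather than being applied once.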


Cited by 37 publications (12 citation statements)
References 34 publications
“…Fusing inertial with vision-based navigation will improve both the estimation quality and the system’s robustness. Additionally, the system can be made more robust to low visibility by using a 3D sonar, either instead of the stereo camera, or by fusing the acoustic with vision data in a manner similar to that described by [ 36 ]. Ongoing testing will provide information on the quality of a 3D sonar used as the only sensor, using the same algorithm described in this paper.…”
Section: Discussion
confidence: 99%
“…Most sonars are however not suitable for high precision close range applications due to their low resolution and relatively high minimum operating range. A hybrid vision-acoustic approach has been proposed by [ 36 ] using a high-end commercial 3D sonar, which aims to provide the robustness of sonars with the colour information and accuracy of cameras. Such a sonar could be used on its own in a similar way to a depth camera, except for the absence of colour information.…”
Section: Introduction
confidence: 99%
“…The use scenario to which the UWAR technology developed in the iMARECulture project is intended consists in providing to the divers the possibility to know their position within the submerged environment and to enable the augmented visualization of the actual conditions of the ancient ruins in the underwater site and a hypothetical reconstruction of the villa, thus easily understanding the luxury and the importance of that building during the Roman era. The starting point for preparing the 3D data and contents that are necessary for the functioning of the proposed UWAR technologies consists of the 3D reconstruction of the abovementioned archaeological area that has been carried out by combining optical and acoustic techniques [Lagudi et al 2016;Mangeruga et al 2018]. The 3D reconstruction model is then populated with a number of points of interest (POIs), placed on the seafloor, which provide the position of the distinctive and characteristic element of the specific underwater site.…”
Section: The Case-Study: The Underwater Archaeological Park of Baiae
confidence: 99%
“…Murino and Fusiello used sonar and camera to model underwater scenes [24]. Lagudi et al. proposed an alignment method for the integration of stereo cameras and an acoustic camera for underwater 3D data capture [22]. Recently Ye et al.…”
Section: Introduction
confidence: 99%