2019
DOI: 10.3389/fmars.2019.00521

Next-Generation Optical Sensing Technologies for Exploring Ocean Worlds—NASA FluidCam, MiDAR, and NeMO-Net

Abstract: We highlight three emerging NASA optical technologies that enhance our ability to remotely sense, analyze, and explore ocean worlds: FluidCam and fluid lensing, MiDAR, and NeMO-Net. Fluid lensing is the first remote sensing technology capable of imaging through ocean waves without distortions in 3D at sub-cm resolutions. Fluid lensing and the purpose-built FluidCam CubeSat instruments have been used to provide refraction-corrected 3D multispectral imagery of shallow marine systems from unmanned aerial vehicles …

Cited by 35 publications (39 citation statements)
References 55 publications

“…However, we advise caution when considering purchasing these systems, and suggest ensuring that the networks required to run the corrections are available in operator areas of interest. Finally, we look forward to further advances in machine learning as a means to accurately identify and automate the mapping of appropriate ray habitat [73,76–78]. With the large volumes of data captured by drones, they are ideally suited to building artificial intelligence models, so they present an opportunity as much as a challenge.…”
Section: Overcoming Drone-based Habitat Mapping Challenges
confidence: 99%
“…Using airborne images to analyze underwater environments (a) requires calm water surfaces, (b) is dependent on the Secchi depth (visibility by means of turbidity), and (c) is dependent on the surface textures of underwater areas (Mandlburger 2019). Within the water column, light is continuously attenuated through absorption and scattering along the optical path (Chirayath and Earle 2016; Chirayath and Li 2019). Bathymetric reconstruction in calmer and deeper waters with user-grade UAVs (and without LiDAR sensors) is possible, but is characterized by large uncertainties and becomes more challenging with increasing water depths and hydrodynamic activity (Casella et al. 2017).…”
Section: Introduction
confidence: 99%
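
The attenuation described in the statement above is commonly modeled with the Beer-Lambert relation, I(z) = I0 * exp(-Kd * z), where Kd is the wavelength-dependent diffuse attenuation coefficient. The short Python sketch below is illustrative only and is not drawn from the cited papers; the Kd values are assumed placeholders for clear versus turbid water.

import numpy as np

def attenuated_irradiance(i0, depth_m, k_d):
    # Beer-Lambert attenuation of downwelling irradiance with depth.
    # i0      : surface irradiance (any consistent unit)
    # depth_m : water depth in meters
    # k_d     : diffuse attenuation coefficient (1/m), wavelength dependent
    return i0 * np.exp(-k_d * depth_m)

# Assumed example coefficients (not from the cited work):
# roughly 0.04 1/m for clear oceanic water, 0.5 1/m for turbid coastal water.
depths = np.array([1.0, 5.0, 10.0])
for label, k_d in [("clear", 0.04), ("turbid", 0.5)]:
    print(label, np.round(attenuated_irradiance(1.0, depths, k_d), 3))

Under these assumed coefficients, about 67% of surface irradiance remains at 10 m depth in the clear-water case and under 1% in the turbid case, which is one reason bathymetric reconstruction becomes harder with increasing depth and turbidity.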
“…Still, these results should encourage further studies on the proposed approach, such as using ROV systems to produce higher-quality images for monitoring deep seafloor areas [30]. Moreover, the same proposed approaches could be studied with the new Fluid lensing optical multispectral instrument (FluidCam) and the Multispectral Imaging Detection and Active Reflectance instrument (MiDAR) developed by NASA [62]. FluidCam can produce 3D multispectral images corrected for refraction in shallow marine environments.…”
Section: Discussion
confidence: 88%