Megafauna play an important role in benthic ecosystem function and are sensitive indicators of environmental change. Non-invasive monitoring of benthic communities can be accomplished by seafloor imaging. However, manual quantification of megafauna in images is labor-intensive, and this organism size class is therefore often neglected in ecosystem studies. Automated image analysis has been proposed as a possible solution, but the heterogeneity of megafaunal communities poses a non-trivial challenge for such automated techniques. Here, the potential of a generalized object detection architecture, referred to as iSIS (intelligent Screening of underwater Image Sequences), is investigated for the quantification of a heterogeneous group of megafauna taxa. The iSIS system is tuned for a particular image sequence (i.e. a transect) using a small subset of the images, in which megafauna taxa positions were previously marked by an expert. To investigate the potential of iSIS and compare its results with those obtained from human experts, a group of eight different taxa from one camera transect of seafloor images taken at the Arctic deep-sea observatory HAUSGARTEN is used. The results show that inter- and intra-observer agreement of human experts varies considerably between species, with a similar degree of variation apparent in the results derived automatically by iSIS. Whilst some taxa (e.g. Bathycrinus stalks, Kolga hyalina, the small white sea anemone) were well detected by iSIS (overall sensitivity: 87%, overall positive predictive value: 67%), others, such as the small sea cucumber Elpidia heckeri, remain challenging for both human observers and iSIS.
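For reference, the detection figures quoted above follow the standard definitions of sensitivity and positive predictive value. The short Python sketch below illustrates the calculation; the counts are hypothetical and are not taken from the study.

```python
# Minimal sketch of the detection metrics quoted above (hypothetical counts).
def sensitivity(tp, fn):
    """Fraction of expert-annotated megafauna that the detector found."""
    return tp / (tp + fn)

def positive_predictive_value(tp, fp):
    """Fraction of automatic detections that correspond to real megafauna."""
    return tp / (tp + fp)

# Example with invented counts for one taxon:
tp, fp, fn = 87, 43, 13
print(f"Sensitivity: {sensitivity(tp, fn):.0%}")                              # 87%
print(f"Positive predictive value: {positive_predictive_value(tp, fp):.0%}")  # 67%
```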
Cold-water coral reefs are recognised as important biodiversity hotspots on the continental margin. The location of terrain features likely to be associated with living reef has been made easier by recent developments in acoustic sensing technology. For accurate assessment and fine-scale mapping of these newly identified coral habitats, analysis of video data is still required. In the present study we explore the potential of manual and automatic abundance estimation of cold-water corals and sponges from still image frames extracted from video footage from Tisler Reef (Skagerrak, Norway). The results and processing times of three standard visual assessment methods (15-point quadrat, 100-point quadrat and frame mapping) are compared with those produced by a new computer vision system, which uses machine-learning algorithms to detect species within frames automatically. Cold-water coral density estimates obtained from the automated method were similar to those obtained with the other methods. The automated method slightly underestimated coral coverage (by 10 to 20%) in frames that lacked uniform seabed illumination, but it detected small live coral fragments considerably better than the 15-point method. For assessing sponge coverage, the automated system performed less satisfactorily: it misclassified a small fraction of the seabed as sponge (0.1 to 2% of most frames) and underestimated sponge coverage in frames containing many sponges. The results indicate that the machine-learning approach is appropriate for estimating live cold-water coral density, but further work is required before the system can be applied to sponges within the reef environment.
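The point-quadrat methods mentioned above estimate percentage cover from the fraction of quadrat points assigned to each substrate class. The following minimal Python sketch illustrates the general idea with hypothetical point labels; it is not the study's exact protocol.

```python
# Minimal sketch of point-quadrat cover estimation (hypothetical point labels).
from collections import Counter

def percent_cover(point_labels):
    """Percentage cover per class from the labels assigned to each quadrat point."""
    counts = Counter(point_labels)
    total = len(point_labels)
    return {cls: 100.0 * n / total for cls, n in counts.items()}

# Example: a 15-point quadrat in which 6 points hit live coral,
# 2 hit sponge and 7 hit bare seabed (invented values).
labels = ["coral"] * 6 + ["sponge"] * 2 + ["seabed"] * 7
print(percent_cover(labels))  # {'coral': 40.0, 'sponge': ~13.3, 'seabed': ~46.7}
```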
We present results from the first geological field tests of the 'Cyborg Astrobiologist', a wearable computer and video camcorder system that we are using to test and train a computer-vision system towards having some of the autonomous decision-making capabilities of a field geologist and field astrobiologist. The Cyborg Astrobiologist platform has thus far been used to test and develop the following algorithms and systems: robotic acquisition of quasi-mosaics of images, real-time image segmentation, and real-time determination of interesting points in the image mosaics. The hardware and software systems function reliably, and the computer-vision algorithms are adequate for these first field tests. In addition to the proof-of-concept aspect of the field tests, their main result is the enumeration of issues to improve in the future, including: first, detection of, and accounting for, shadows caused by 3D jagged edges in the outcrop; second, reincorporation of more sophisticated texture-analysis algorithms into the system; third, creation of hardware and software capabilities to control the camera's zoom lens in an intelligent manner; and fourth, development of algorithms for interpretation of complex geological scenery. Despite these technical limitations, the Cyborg Astrobiologist system, consisting of a camera-equipped wearable computer and its computer-vision algorithms, has demonstrated its ability to find genuinely interesting points in the geological scenery in real time and then gather more information about these interest points in an automated manner.
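As an illustration of the kind of interest-point selection described above, one simple way to flag uncommon regions after segmentation is to rank segments by how rare they are in the image. The sketch below is a hedged approximation under that assumption and is not the authors' actual algorithm.

```python
# Hedged sketch: flag "uncommon" regions after segmentation by ranking
# segments by pixel count (illustration only, not the authors' algorithm).
import numpy as np

def uncommon_regions(segment_labels, top_k=3):
    """Return the labels of the top_k rarest segments by pixel count."""
    labels, counts = np.unique(segment_labels, return_counts=True)
    order = np.argsort(counts)          # rarest segments first
    return labels[order[:top_k]]

# Example on a toy 4x4 label image (invented values):
seg = np.array([[0, 0, 0, 1],
                [0, 0, 1, 1],
                [0, 2, 2, 2],
                [0, 0, 0, 3]])
print(uncommon_regions(seg, top_k=2))   # [3 1]
```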
The 'Cyborg Astrobiologist' has undergone a second geological field trial, at a site in northern Guadalajara, Spain, near Riba de Santiuste. The site at Riba de Santiuste is dominated by layered deposits of red sandstones. The Cyborg Astrobiologist is a wearable computer and video camera system that has demonstrated a capability to find uncommon interest points in geological imagery in real time in the field. In this second field trial, the computer vision system of the Cyborg Astrobiologist was tested at seven different tripod positions on three different geological structures. The first geological structure was an outcrop of nearly homogeneous sandstone, which exhibits oxidized-iron impurities in red and an absence of these iron impurities in white. The white areas in these "red beds" have turned white because the iron has been removed. The iron removal from the sandstone can proceed once the iron has been chemically reduced, perhaps by a biological agent. In one instance the computer vision system found several (iron-free) white spots, as well as several small, dark nodules, to be uncommon and therefore interesting. The second geological structure was another outcrop some 600 meters to the east, with white, textured mineral deposits on the surface of the sandstone at the bottom of the outcrop. The computer vision system found these white, textured mineral deposits to be interesting. We acquired samples of the mineral deposits for geochemical analysis in the laboratory. This laboratory analysis of the crust identified a double layer, consisting of an internal millimeter-scale layering of calcite and an external centimeter-scale efflorescence of gypsum. The third geological structure was a 50 cm thick paleosol layer containing fossilized root structures of plants. The computer vision system also found certain areas of these root structures to be interesting. A quasi-blind comparison of the Cyborg Astrobiologist's interest points for these images with the interest points determined afterwards by a human geologist shows that the Cyborg Astrobiologist concurred with the human geologist 68% of the time (true positive rate), with a 32% false positive rate and a 32% false negative rate. The performance of the Cyborg Astrobiologist's computer vision system was by no means perfect, so there is ample room for improvement. Nonetheless, these tests validate the image-segmentation and uncommon-mapping technique that we first employed at a different geological site (Rivas Vaciamadrid) with somewhat different imagery properties.
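The rates quoted above are consistent with the usual definitions, assuming (as a simplification, not necessarily the study's exact matching protocol) that the true positive and false negative rates are computed over the human geologist's interest points and the false positive rate over the system's interest points. A minimal Python sketch with hypothetical counts follows.

```python
# Hedged sketch of the interest-point agreement rates (hypothetical counts;
# the exact matching protocol of the study may differ).
def agreement_rates(matched, missed_by_system, extra_from_system):
    """Rates from counts of human/system interest-point matches."""
    human_points = matched + missed_by_system      # all human-selected points
    system_points = matched + extra_from_system    # all system-selected points
    tpr = matched / human_points                   # true positive rate
    fnr = missed_by_system / human_points          # false negative rate = 1 - TPR
    fpr = extra_from_system / system_points        # false positive rate
    return tpr, fnr, fpr

# Example with invented counts:
tpr, fnr, fpr = agreement_rates(matched=17, missed_by_system=8, extra_from_system=8)
print(f"TPR {tpr:.0%}, FNR {fnr:.0%}, FPR {fpr:.0%}")  # TPR 68%, FNR 32%, FPR 32%
```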