Successful marine management relies on understanding patterns of human use. However, obtaining data can be difficult and expensive given the widespread and variable nature of these activities. Remote camera systems are increasingly used to overcome the cost limitations of conventional labour-intensive methods, but most systems face trade-offs between the spatial extent and resolution over which data are obtained, limiting their application. We trialed a novel methodology, the CSIRO Ruggedized Autonomous Gigapixel System (CRAGS), to capture time series of high-resolution photo-mosaic (HRPM) imagery and estimate fine-scale metrics of human activity at an artificial reef located 1.3 km from shore. We compared estimates obtained with this novel system to those produced by a web camera that concurrently monitored the site, evaluated the effect of day type (weekday/weekend) and time of day on each system, and compared both against estimates obtained from binocular observations. In general, the two systems produced estimates of the number of boats that were similar to each other and to binocular counts, and these results were unaffected by day type (weekend vs. weekday). CRAGS also captured information on user type and party size that was not obtainable with the lower-resolution webcam. However, time of day did have an effect: CRAGS suffered from poor image quality in early-morning conditions because of its fixed camera settings. Our field study provides a proof of concept for this cost-effective monitoring tool for the remote collection of high-resolution, large-extent data on patterns of human use at high temporal frequency.
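The kind of comparison described above can be illustrated with a minimal sketch, assuming boat counts are tabulated per sampling occasion for each method; the toy data, column names, and pandas-based summary are illustrative assumptions, not the study's actual analysis.

```python
# Illustrative sketch: comparing concurrent boat counts from two camera systems
# against binocular counts, split by day type and time of day. Data are invented.
import pandas as pd

counts = pd.DataFrame({
    "day_type":    ["weekday", "weekday", "weekend", "weekend"],
    "time_of_day": ["morning", "midday", "morning", "midday"],
    "crags":       [2, 5, 6, 9],
    "webcam":      [2, 4, 6, 8],
    "binocular":   [2, 5, 7, 9],
})

# Mean counts per method, grouped by day type and time of day.
summary = counts.groupby(["day_type", "time_of_day"])[["crags", "webcam", "binocular"]].mean()
print(summary)

# Simple agreement check: mean absolute difference from the binocular reference.
for method in ["crags", "webcam"]:
    mad = (counts[method] - counts["binocular"]).abs().mean()
    print(f"{method}: mean absolute difference from binocular counts = {mad:.2f}")
```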
Collecting data on unlicensed open-access coastal activities, such as some types of recreational fishing, has often relied on telephone interviews with respondents selected from landline directories. This approach is becoming obsolete with changes in communication technology, such as the switch to unlisted mobile phones, and other methods, such as boat ramp interviews, are often impractical because of high labor costs. We trialed an autonomous, ultra-high-resolution photosampling method as a cost-effective solution for direct measurement of a recreational fishery. Sequential photosamples were batch processed with a novel software application to produce "big data" time-series movies of a spatial subset of the fishery, and we validated the results against a regional bus-route survey and interviews with participants at access points. We also compared labor costs between the two methods. Most trailer boat users were recreational fishers targeting tuna spp. The camera system closely matched trends in temporal variation from the larger-scale regional survey, but because the camera data were collected at much higher frequency, we could additionally describe strong daily variability in effort. Peaks were normally associated with weekends, but consecutive weekend tuna fishing competitions produced anomalously high effort across the usual weekday lulls. Because field time was reduced and imagery was batch processed, monthly labor costs for the camera sampling were a quarter of those for the bus-route survey, and individual camera samples cost 2.5% as much as bus-route samples to obtain. Gigapixel panoramic camera observations of fishing were representative of the temporal variability of regional fishing effort and could be used to develop a cost-efficient index. High-frequency sampling had the added benefit of being more likely to detect abnormal patterns of use. Combining remote sensing with on-site interviews may provide a way to describe highly variable effort in recreational fisheries while also validating activity and catch.
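As a rough illustration of turning high-frequency photo samples into an effort index and contrasting weekday and weekend use, the sketch below aggregates hypothetical per-sample boat counts into daily means; the timestamps, counts, and column names are invented for illustration and are not the study's data or software.

```python
# Illustrative sketch: building a daily effort index from high-frequency photo samples.
# One row per photo sample: timestamp and number of trailer boats counted in the frame.
import pandas as pd

samples = pd.DataFrame({
    "timestamp": pd.to_datetime([
        "2024-03-01 07:00", "2024-03-01 12:00",
        "2024-03-02 07:00", "2024-03-02 12:00",
        "2024-03-04 07:00", "2024-03-04 12:00",
    ]),
    "boats": [3, 5, 12, 15, 4, 6],
})

# Average boat count per calendar day from the high-frequency samples.
daily = samples.set_index("timestamp")["boats"].resample("D").mean().dropna()

# Contrast weekday and weekend effort (Monday=0 ... Sunday=6).
is_weekend = daily.index.dayofweek >= 5
print("Mean weekend effort:", daily[is_weekend].mean())
print("Mean weekday effort:", daily[~is_weekend].mean())
```

Because the index is computed per day, unusual weekday peaks (for example during a fishing competition spanning consecutive weekends) would stand out against the typical weekday level.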
Electronic monitoring (EM) is increasingly used to monitor catch and bycatch in wild capture fisheries, but EM video data are still manually reviewed, which adds to ongoing management costs. Computer vision, machine learning, and artificial intelligence-based systems are seen as the next step in automating EM data workflows. Here we describe some of the obstacles we confronted, and the approaches we took, in developing a system to automatically identify and count target and bycatch species using cameras deployed on an industry vessel. A convolutional neural network was trained to detect and classify target and bycatch species groups, and a visual tracking system was developed to produce counts. The multiclass detector achieved a mean Average Precision of 53.42%. Based on the detection results, the visual tracking system produced automatic fish counts for the test video data. Automatic counts were within two standard deviations of the manual counts for the target species, and in most cases for the bycatch species. Unlike in other recent attempts, weather and lighting conditions were largely controlled by mounting the cameras under cover.
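The counting step can be sketched roughly as follows; this is not the authors' implementation, and the hard-coded detections, class labels, and IoU threshold are assumptions standing in for real CNN outputs.

```python
# Minimal sketch: counting fish from per-frame detections with a simple,
# greedy IoU-based tracker. Detections are (class label, [x1, y1, x2, y2]) tuples.

def iou(a, b):
    """Intersection-over-union of two boxes given as [x1, y1, x2, y2]."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0


def count_tracks(frames, iou_threshold=0.3):
    """Greedily link same-class detections across frames and count unique tracks."""
    tracks = []   # each track: {"label": class name, "box": most recent box}
    counts = {}   # running count of new tracks per class
    for detections in frames:
        for label, box in detections:
            candidates = [(iou(t["box"], box), t) for t in tracks if t["label"] == label]
            best_iou, best_track = max(candidates, key=lambda c: c[0], default=(0.0, None))
            if best_track is not None and best_iou >= iou_threshold:
                best_track["box"] = box                      # continue an existing track
            else:
                tracks.append({"label": label, "box": box})  # start a new track
                counts[label] = counts.get(label, 0) + 1
    return counts


# Two consecutive frames: the same tuna moving slightly, plus a bycatch fish appearing later.
frames = [
    [("tuna", [100, 100, 180, 160])],
    [("tuna", [110, 105, 190, 165]), ("bycatch", [300, 200, 360, 250])],
]
print(count_tracks(frames))  # {'tuna': 1, 'bycatch': 1}
```

A production tracker would also handle track expiry and enforce one-to-one assignment between detections and tracks (e.g., Hungarian matching); this greedy version only illustrates how per-frame detections become per-class counts.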