Remote cameras are a common method for surveying wildlife and have recently been promoted for implementing large-scale regional biodiversity monitoring programs. The use of camera-trap data depends on the correct identification of animals captured in the photographs, yet misidentification rates can be high, especially when morphologically similar species co-occur, and this can lead to faulty inferences and hinder conservation efforts. Correct identification depends on diagnosable taxonomic characters, photograph quality, and the experience and training of the observer. However, keys rooted in taxonomy are rarely used for identifying camera-trap images, and error rates are rarely assessed, even when morphologically similar species are present in the study area. We tested a method for ensuring high identification accuracy using two sympatric and morphologically similar chipmunk (Neotamias) species as a case study. We hypothesized that identification accuracy would improve with use of an identification key and with observer training, resulting in higher levels of observer confidence and higher levels of agreement among observers. We developed an identification key and tested identification accuracy using photographs of verified museum specimens. Our results supported predictions for each of these hypotheses. In addition, we validated the method in the field by comparing remote-camera data with live-trapping data. We recommend these methods for evaluating error rates and excluding ambiguous records in camera-trap datasets. We urge researchers to ensure correct and scientifically defensible species identifications and to incorporate this step into the camera-trap workflow.
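As an illustration of how identification accuracy and inter-observer agreement could be quantified against verified specimens, the sketch below computes per-observer accuracy and Cohen's kappa. The species labels, observer calls, and data are hypothetical and are not taken from the study.

```python
# Hypothetical sketch (not code from the study): quantifying identification
# accuracy against verified specimens and agreement between observers.
# Species labels, observer calls, and data below are invented for illustration.
from sklearn.metrics import accuracy_score, cohen_kappa_score

# Verified identities of specimen photographs (placeholder labels)
truth      = ["species_A", "species_B", "species_A", "species_B", "species_B"]

# Identifications made by two observers after using the key
observer_1 = ["species_A", "species_B", "species_B", "species_B", "species_B"]
observer_2 = ["species_A", "species_B", "species_A", "species_A", "species_B"]

# Accuracy of each observer relative to the verified specimens
print("Observer 1 accuracy:", accuracy_score(truth, observer_1))
print("Observer 2 accuracy:", accuracy_score(truth, observer_2))

# Cohen's kappa: inter-observer agreement corrected for chance agreement
print("Inter-observer kappa:", cohen_kappa_score(observer_1, observer_2))
```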
Occupancy models are commonly used with motion-sensitive camera data to estimate patterns of species occurrence while accounting for false negative detection error (i.e., the species is present but not detected). False positive detection error (i.e., the species is not present but is detected) is also present in camera data sets, especially when morphologically similar species co-occur. Researchers use different approaches to address this problem: ignore the potential for false positive detections, remove all ambiguous detections and treat them as non-detections, or model false positive detection error by dividing detections into ambiguous detections (could be true or false positives) and unambiguous detections (true positives). We performed a simulation study to compare these three strategies. To implement these modeling strategies, detections must be classified as ambiguous or unambiguous, or all ambiguous detections must be reclassified as non-detections. We also performed a simulation study to assess the impact of researcher confidence in designating detections as ambiguous or unambiguous. Ignoring false positive detection error resulted in biased parameter estimates, whereas removing ambiguous detections and modeling false positive detections yielded similar estimates of occupancy probability (ψ) in most situations. Researcher over-confidence (i.e., the tendency of observers to overestimate their own ability) positively biased estimates of ψ, whereas moderate under-confidence did not increase bias or decrease precision in estimates of ψ. Consistent with the patterns observed in simulations, analysis of example data from a chipmunk (Neotamias minimus
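To make the bias concrete, the following minimal simulation (an illustration under assumed parameter values, not the authors' analysis) shows how false positive detections at unoccupied sites inflate a naive occupancy estimate when detection error is ignored.

```python
# Minimal simulation sketch (assumed parameters, not the authors' code):
# false positive detections at unoccupied sites bias naive occupancy upward.
import numpy as np

rng = np.random.default_rng(42)

n_sites = 200        # number of camera sites
n_surveys = 8        # repeat survey occasions per site
psi = 0.5            # true occupancy probability
p = 0.4              # true positive detection probability (occupied sites)
fp = 0.05            # false positive detection probability (unoccupied sites)

# Latent occupancy state for each site
z = rng.binomial(1, psi, n_sites)

# Detection histories: occupied sites detect with p, unoccupied with fp
det_prob = np.where(z == 1, p, fp)
y = rng.binomial(1, det_prob[:, None], (n_sites, n_surveys))

# Naive occupancy estimate that ignores detection error entirely
naive_psi = np.mean(y.any(axis=1))

print(f"True psi: {psi:.2f}")
print(f"Naive estimate (ignoring false positives): {naive_psi:.2f}")
# With fp = 0.05, roughly 1 - 0.95**8 (about a third) of unoccupied sites
# register at least one spurious detection, driving the estimate upward.
```

Fitting the three strategies themselves would require an occupancy likelihood that distinguishes ambiguous from unambiguous detections (in the style of false-positive occupancy models); the sketch above only illustrates why ignoring the problem biases ψ high.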