2018
DOI: 10.1002/ece3.4751
A local evaluation of the individual state‐space to scale up Bayesian spatial capture–recapture

Abstract: Spatial capture–recapture models (SCR) are used to estimate animal density and to investigate a range of problems in spatial ecology that cannot be addressed with traditional nonspatial methods. Bayesian approaches in particular offer tremendous flexibility for SCR modeling. Increasingly, SCR data are being collected over very large spatial extents, making analysis computationally intensive, sometimes prohibitively so. To mitigate the computational burden of large‐scale SCR models, we developed an improved formul…

Cited by 31 publications (51 citation statements)
References 30 publications
“…This challenge is amplified in our analysis due to the size of the problem (number of individuals, spatial and temporal extent). For this reason, we implemented additional approaches to substantially reduce computation time, including 1) a binomial observation model that allowed substantial reduction of the number of detectors (and therefore runtime) without compromising the precision and accuracy of model estimates (48) and 2) removing unnecessary evaluation of the likelihood whenever the distance between a detector and a predicted AC location was larger than a distance threshold (49). This distance was adjusted for each species and sex to maximize the efficiency of the local evaluation.…”
Section: Methods
confidence: 99%
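The local evaluation the statement describes — skipping likelihood terms for detectors farther from a predicted activity center (AC) than a distance threshold — can be illustrated with a minimal sketch. This is not the cited papers' implementation (which is written for MCMC samplers such as NIMBLE); it assumes a half-normal detection function, and the function names, `p0`, `sigma`, and `d_max` values are illustrative.

```python
import numpy as np

def local_detection_probs(ac, detectors, p0, sigma, d_max):
    """Half-normal detection probabilities for one individual,
    evaluated only at detectors within d_max of the activity center.
    Detectors beyond the threshold are assigned probability 0 and
    their likelihood contribution is never computed (local evaluation)."""
    d2 = np.sum((detectors - ac) ** 2, axis=1)     # squared distances to AC
    local = d2 <= d_max ** 2                       # detectors close enough to matter
    p = np.zeros(len(detectors))
    p[local] = p0 * np.exp(-d2[local] / (2.0 * sigma ** 2))
    return p

# Toy example: 5000 detectors on a 100 x 100 landscape, AC at the center
rng = np.random.default_rng(1)
detectors = rng.uniform(0.0, 100.0, size=(5000, 2))
ac = np.array([50.0, 50.0])
p = local_detection_probs(ac, detectors, p0=0.1, sigma=2.0, d_max=10.0)
```

The statement notes that the threshold was tuned per species and sex; in this sketch that corresponds to choosing `d_max` large enough relative to `sigma` that truncated detection probabilities are negligible.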
“…Bayesian SCR models are often preferred over maximum likelihood approaches because they allow for customized analyses and are highly flexible (Royle et al. 2014, Proffitt et al. 2015, Whittington et al. 2018). These advantages can, however, result in excessive computational demand, making them inefficient for many analysis problems (Milleret et al. 2018, 2019). Despite large improvements in computer processing speed and ease of parallel processing, it is common for models to run for days or even a month.…”
confidence: 99%
“…We also estimated year‐specific p 0 to account for annual variation in sampling intensity. To increase computing efficiency, we used a local evaluation of the state space to reduce the number of detectors considered for each individual during the model fit (Milleret et al. 2019, Sutherland et al. 2019). Searches were conducted continuously from 2013 to 2017, which allowed us to introduce different artificial gaps in the data time series, while having a reference point (scenario without gaps: 11111).…”
Section: Methods
confidence: 99%