2020
DOI: 10.1002/ecy.2953
Improving spatial predictions of animal resource selection to guide conservation decision making

Abstract: Resource selection is often studied by ecologists interested in the environmental drivers of animal space use and movement. These studies commonly produce spatial predictions, which are of considerable utility to resource managers making habitat and population management decisions. It is thus paramount that predictions from resource selection studies are accurate. We evaluated model building and fitting strategies for optimizing resource selection function predictions in a use‐availability framework. We did so…

Cited by 13 publications (23 citation statements)
References 37 publications
“…The Bayesian Lasso (Park and Casella 2008) protects against possible overfitting and predictor collinearity in a multivariate modeling framework by shrinking coefficient estimates toward zero when they are not well-supported by the data (Hooten and Hobbs 2015, Authier et al. 2017). A form of regularization (i.e., optimizing model fit by penalizing greater numbers of parameters), the Lasso results in better predictions and model stability in multiple regression models, particularly those with increasing numbers of candidate predictors (Tibshirani et al. 2012, Gerber and Northrup 2020). We estimated model parameters in JAGS 4.2.0 (Plummer 2003) using packages rjags (Plummer 2018) and jagsUI (Kellner 2018) within R 3.6.3 (R Core Team 2019).…”
Section: Statistical Analyses
confidence: 99%
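The shrinkage behavior this statement describes can be illustrated with a minimal lasso fit by coordinate descent. This is a sketch of the L1-penalty idea only, with made-up data; the cited study instead uses the Bayesian Lasso (a Laplace prior on coefficients, sampled in JAGS), not this optimization.

```python
# Minimal lasso (L1-penalized least squares) via coordinate descent.
# Illustrative only: the Bayesian Lasso in the quoted study places a
# Laplace prior on coefficients and samples the posterior in JAGS
# rather than optimizing a penalty directly.

def soft_threshold(z, gamma):
    """Shrink z toward zero by gamma; the core lasso operation."""
    if z > gamma:
        return z - gamma
    if z < -gamma:
        return z + gamma
    return 0.0

def lasso_fit(X, y, lam, n_iter=200):
    """Coordinate descent for (1/2n)||y - Xb||^2 + lam*||b||_1.
    X: list of rows (predictors assumed standardized), y: responses."""
    n, p = len(X), len(X[0])
    beta = [0.0] * p
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual excluding predictor j.
            r = [y[i] - sum(X[i][k] * beta[k] for k in range(p) if k != j)
                 for i in range(n)]
            rho = sum(X[i][j] * r[i] for i in range(n)) / n
            norm = sum(X[i][j] ** 2 for i in range(n)) / n
            beta[j] = soft_threshold(rho, lam) / norm
    return beta
```

With a predictor that carries no signal, its coefficient is shrunk to exactly zero, which is the overfitting protection the statement refers to.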
“…Structured cross-validation methods that partition data non-randomly into spatially or temporally distinct folds can more accurately assess transferability of SDMs (Araújo et al., 2005; Radosavljevic & Anderson, 2014; Roberts et al., 2017). Structuring cross-validation folds in space allows investigators to optimize model complexity for transferability while avoiding the overfitting of species-environment relationships (Anderson & Gonzalez, 2011; Gerber & Northrup, 2019). Roberts et al. (2017) discussed numerous aspects of cross-validation design that affect reliability of predictive accuracy estimates, yet the delineation of spatial folds is not always intuitive and there are no one-size-fits-all approaches for spatially structuring cross-validation when the goal is to extrapolate predictions to other portions of the species' range.…”
Section: Introduction
confidence: 99%
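The spatial-fold structuring described above can be sketched by assigning points to grid-cell blocks and holding out one block at a time, so train and test sets are spatially separated. The block size and coordinates are illustrative assumptions, not the design of any cited study.

```python
# Sketch of spatially blocked cross-validation: points are assigned to
# grid-cell folds by location, so held-out data are spatially distinct
# from training data. Block size is a tuning choice left to the analyst.

def spatial_folds(coords, block_size):
    """Map each (x, y) point to a fold id based on its grid cell."""
    cells = sorted({(int(x // block_size), int(y // block_size))
                    for x, y in coords})
    cell_to_fold = {cell: k for k, cell in enumerate(cells)}
    return [cell_to_fold[(int(x // block_size), int(y // block_size))]
            for x, y in coords]

def blocked_splits(coords, block_size):
    """Yield (train_idx, test_idx) pairs, one per spatial block."""
    folds = spatial_folds(coords, block_size)
    for f in sorted(set(folds)):
        test = [i for i, g in enumerate(folds) if g == f]
        train = [i for i, g in enumerate(folds) if g != f]
        yield train, test
```

Each split trains on all blocks except one and tests on the held-out block, which probes transferability to unsampled regions rather than interpolation among neighboring points.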
“…Regularization parameters are often selected arbitrarily or fixed in advance by investigators (e.g., using default settings of Maxent; Phillips et al., 2006), yet statistical regularization can also be implemented concurrently with optimizing the predictive performance of SDMs. In this case, regularization parameter values are selected to optimize predictive performance of resulting models, given the observed data (Gerber & Northrup, 2019; Hastie et al., 2009). This range of regularization parameters may be necessary to optimize model complexity for transferability.…”
Section: Introduction
confidence: 99%
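Selecting the regularization strength by predictive performance, rather than fixing it in advance, can be sketched as a grid search over candidate penalties scored on held-out data. The ridge-style single-predictor model, the data, and the penalty grid here are all illustrative assumptions.

```python
# Sketch of data-driven regularization tuning: fit a shrunken slope for
# each candidate penalty and keep the one with the lowest held-out
# squared error. Model and grid are illustrative, not any study's method.

def ridge_slope(xs, ys, lam):
    """Closed-form ridge estimate for a no-intercept single predictor:
    minimizes sum (y - b*x)^2 + lam * b^2."""
    sxy = sum(x * y for x, y in zip(xs, ys))
    sxx = sum(x * x for x in xs)
    return sxy / (sxx + lam)

def tune_lambda(train, valid, grid):
    """Return the penalty from `grid` minimizing validation error."""
    xs, ys = zip(*train)
    vx, vy = zip(*valid)

    def val_err(lam):
        b = ridge_slope(xs, ys, lam)
        return sum((y - b * x) ** 2 for x, y in zip(vx, vy))

    return min(grid, key=val_err)
```

The same pattern generalizes to cross-validated selection: score each candidate penalty across all folds and keep the value with the best average out-of-sample performance.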