2018
DOI: 10.1080/09296174.2018.1499457

Confronting Quasi-Separation in Logistic Mixed Effects for Linguistic Data: A Bayesian Approach

Abstract: Mixed effects regression models are widely used by language researchers. However, these regressions are implemented with an algorithm that may not converge on a solution. While convergence issues in linear mixed effects models can often be addressed with careful experiment design and model building, logistic mixed effects models introduce the possibility of separation or quasi-separation, which can cause problems for model estimation that result in convergence errors or in unreasonable model estimates. These …
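To make the abstract's point concrete, here is a minimal sketch (simulated data; all variable names are hypothetical, not taken from the paper) of how quasi-separation arises in a logistic mixed effects model: when one condition produces almost no failures, the maximum-likelihood estimate for that condition's coefficient drifts toward infinity, and lme4's glmer typically reports convergence warnings or implausibly large estimates.

```r
# Hypothetical simulation: condition "b" almost never yields an error,
# so the data are quasi-separated on condition.
library(lme4)

set.seed(1)
d <- data.frame(
  subject   = factor(rep(1:20, each = 10)),
  condition = factor(rep(c("a", "b"), each = 5, times = 20))
)
d$correct <- ifelse(d$condition == "b",
                    rbinom(nrow(d), 1, 0.995),  # ~100% correct: quasi-separation
                    rbinom(nrow(d), 1, 0.60))

# Maximum-likelihood fit: expect convergence warnings and an inflated
# coefficient and standard error for the condition effect.
m_ml <- glmer(correct ~ condition + (1 | subject),
              data = d, family = binomial)
coef(summary(m_ml))
```

The Bayesian remedy the paper advocates is sketched after the last citation statement below.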

Cited by 16 publications (12 citation statements) · References 41 publications
“…This may actually be tremendously useful for people using hierarchical linear models. When using appropriate random effect structures (see Barr et al., 2013; Bates et al., 2015), these models are known to run into convergence issues (e.g., Kimball, Shantz, Eager, & Roy, 2018). To remedy such convergence issues, a common strategy is to drop complex random effect terms incrementally (Matuschek, Kliegl, Vasishth, Baayen, & Bates, 2017).…”
Section: Preregistrations and Registered Reports
Citation type: mentioning (confidence: 99%)
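The incremental-simplification strategy this passage describes can be sketched as follows (a hypothetical lme4 example with simulated reaction times; all names are illustrative, not drawn from the cited papers):

```r
# Illustrative data: crossed subjects and items, one two-level condition.
library(lme4)

set.seed(2)
d <- expand.grid(subject   = factor(1:24),
                 item      = factor(1:16),
                 condition = factor(c("a", "b")))
d$rt <- rnorm(nrow(d), mean = 500, sd = 50)

# 1. Start with the maximal random effect structure (Barr et al., 2013).
m1 <- lmer(rt ~ condition + (1 + condition | subject) + (1 + condition | item),
           data = d)

# 2. If it fails to converge (or is singular), drop one random slope ...
m2 <- lmer(rt ~ condition + (1 + condition | subject) + (1 | item),
           data = d)

# 3. ... and, if needed, the other, keeping only random intercepts
#    (the parsimony argument of Matuschek et al., 2017).
m3 <- lmer(rt ~ condition + (1 | subject) + (1 | item), data = d)
```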
“…The first analysis asked whether the three participant groups showed differential effects of Cognate Status, Noise Condition, or their interaction. Unfortunately, due to model convergence issues (perhaps the result of quasi-separation; Kimball et al., 2019), it was not possible to fit a sufficiently complex mixed effects logistic regression to the accuracy data. We therefore present descriptive statistics for both real words and nonwords in order to qualitatively evaluate response strategies across groups and conditions.…”
Section: Analysis and Results
Citation type: mentioning (confidence: 99%)
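Before attributing such failures to the optimizer, one can check for (quasi-)separation directly by cross-tabulating the outcome against the design cells: cells with (nearly) all-correct or all-incorrect responses are the culprit. A base-R sketch, with simulated data and hypothetical column names that merely echo the design described in the quote:

```r
# Hypothetical data mimicking a groups x noise x cognate-status design.
set.seed(3)
d <- expand.grid(group   = factor(c("g1", "g2", "g3")),
                 noise   = factor(c("quiet", "noise")),
                 cognate = factor(c("cognate", "noncognate")),
                 rep     = 1:50)
d$accuracy <- rbinom(nrow(d), 1, ifelse(d$noise == "quiet", 0.99, 0.80))

# Proportion correct per design cell; values at or near 0 or 1
# flag the cells driving (quasi-)separation.
aggregate(accuracy ~ group + noise + cognate, data = d, FUN = mean)

# Raw counts make empty cells visible as well.
with(d, table(group, noise, accuracy))
```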
“…Another practical reason is that complex random effects structures that are often required for linguistic data analysis often do not, or not easily, converge with lme4. In contrast, complex random effects structures are more likely to converge when fitting Bayesian models (Eager & Roy, 2017; Kimball et al., 2019; Sorensen & Vasishth, 2015).…”
Section: A Hands-on Poisson Regression Analysis Example
Citation type: mentioning (confidence: 99%)
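Finally, a hedged sketch of the Bayesian route this passage refers to, using brms with weakly informative priors; the formula and data are hypothetical, re-creating the quasi-separated simulation from the first sketch above:

```r
# Bayesian fit via brms: weakly informative priors regularize the
# coefficients, so quasi-separation no longer pushes estimates toward
# infinity, and richer random effect structures can often be retained.
library(brms)

# Recreate the quasi-separated data from the first sketch.
set.seed(1)
d <- data.frame(
  subject   = factor(rep(1:20, each = 10)),
  condition = factor(rep(c("a", "b"), each = 5, times = 20))
)
d$correct <- ifelse(d$condition == "b",
                    rbinom(nrow(d), 1, 0.995),
                    rbinom(nrow(d), 1, 0.60))

m_bayes <- brm(
  correct ~ condition + (1 | subject),
  data   = d,
  family = bernoulli(),
  prior  = c(prior(normal(0, 2.5), class = "b"),
             prior(normal(0, 5),   class = "Intercept")),
  chains = 4, cores = 4
)
summary(m_bayes)
```

Under a flat prior a separated coefficient has no finite maximum-likelihood estimate; the normal(0, 2.5) prior supplies the curvature that keeps the posterior proper and the estimate finite.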