2020
DOI: 10.32614/rj-2020-026

BayesMallows: An R Package for the Bayesian Mallows Model

Abstract: BayesMallows is an R package for analyzing preference data in the form of rankings with the Mallows rank model, and its finite mixture extension, in a Bayesian framework. The model is grounded in the idea that the probability density of an observed ranking decreases exponentially with its distance to the location parameter. It is the first Bayesian implementation that allows a wide choice of distances, and it works well with a large number of items to be ranked. BayesMallows handles non-standard data: partial r…
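For orientation, the model referenced in the abstract is conventionally written in the following form (this is the standard parametrization used by Vitelli et al. and in the package documentation, sketched here for context):

\[
P(r \mid \alpha, \rho) \;=\; \frac{1}{Z_n(\alpha)} \exp\!\left\{ -\frac{\alpha}{n}\, d(r, \rho) \right\}, \qquad r \in \mathcal{P}_n,
\]

where \(\rho \in \mathcal{P}_n\) is the latent consensus (location) ranking, \(\alpha > 0\) is a scale parameter, \(d(\cdot,\cdot)\) is a right-invariant distance between rankings (footrule, Spearman, Kendall, Cayley, Hamming or Ulam), and \(Z_n(\alpha)\) is the normalizing constant, which does not depend on \(\rho\) for right-invariant distances.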

Cited by 10 publications (13 citation statements) · References 25 publications

“…Enumeration for n ≤ 9: when the number of items n is small, it is possible to enumerate all possible rankings {i_1, ..., i_n} ∈ P_n and compute the KL divergence between the Pseudo-Mallows distribution and the Mallows posterior. We first generate full ranking datasets with n items and N users by drawing N independent rankings from the Mallows distribution with α_0 and ρ_0 using the BayesMallows R package (Sørensen et al., 2020). For convenience, we fix ρ_0 = {1, 2, ..., n}.…”
Section: Empirical Study of the Optimal Pseudo-Mallows Conjecture
confidence: 99%
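As an illustration of the simulation step described in this quotation, a minimal R sketch using the package's sample_mallows() function might look as follows; the values of n, N and α_0 are placeholders, not taken from the cited paper.

```r
library(BayesMallows)

set.seed(1)
n_items <- 8          # n: number of items (placeholder value)
N_users <- 50         # N: number of simulated users (placeholder value)
rho0    <- 1:n_items  # true consensus ranking, fixed to (1, 2, ..., n) as in the quote
alpha0  <- 3          # true scale parameter (placeholder value)

# Draw N independent complete rankings from the Mallows distribution with
# parameters alpha0 and rho0; sample_mallows() is exported by BayesMallows
# and uses the footrule distance by default.
R <- sample_mallows(rho0 = rho0, alpha0 = alpha0, n_samples = N_users)
dim(R)  # N_users x n_items matrix of sampled rankings
```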
“…For each dataset, we then run the Mallows MCMC and the Pseudo-Mallows, as described in Algorithm 3, for a grid of different numbers of iterations. After running the algorithms, we obtain a point estimate of the consensus parameter ρ by calculating the CP consensus (Sørensen et al., 2020) based on the samples obtained by the algorithm. We then record the footrule distance between the CP consensus and the truth ρ_0 in order to assess the estimation accuracy of both algorithms for a given computing time.…”
Section: Infer ρ from Full Ranking Data
confidence: 99%
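Continuing the simulation sketch above (and reusing R, rho0 and n_items), one way to obtain the CP consensus and its footrule distance to the truth could look as follows. The argument names follow the classical compute_mallows() interface described in the R Journal paper; newer package versions organize these arguments through helper functions.

```r
# Run the MCMC for the Bayesian Mallows model on the simulated rankings.
fit <- compute_mallows(rankings = R, nmc = 10000)
fit$burnin <- 2000  # discard warm-up iterations before summarising

# Cumulative-probability (CP) consensus ranking of the items.
cp <- compute_consensus(fit, type = "CP")

# Footrule distance between the CP consensus and the true rho0.
# Item labels are assumed to be the package defaults "Item 1", ..., "Item n"
# (an assumption: they depend on the column names of R).
idx <- match(cp$item, paste("Item", seq_len(n_items)))
rho_hat <- numeric(n_items)
rho_hat[idx] <- cp$ranking
sum(abs(rho_hat - rho0))
```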
“…To perform inference, Vitelli et al. 17 proposed an MCMC algorithm based on a Metropolis-Hastings (MH) scheme (see Sørensen et al. 18 for details on the implementation).…”
Section: BMM for Complete Data
confidence: 99%
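For the Metropolis-Hastings implementation referred to here, convergence of the resulting chains can be inspected with the package's diagnostic function; a brief sketch, again reusing fit from the block above and assuming the classical interface:

```r
# Trace plot of the scale parameter alpha across MCMC iterations.
assess_convergence(fit)

# Trace plots of the latent ranks of a few selected items,
# used to decide on a suitable burn-in before posterior summaries.
assess_convergence(fit, parameter = "rho", items = 1:4)
```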
“…Our current work is based on the Bayesian Mallows model (BMM) 17 and its software implementation BayesMallows. 18 BayesMallows already provides a computationally feasible inferential approach for the most important choices of distance among permutations, and it shows good accuracy when compared to competitors on datasets of moderate size. 19 However, for this method to be applicable to the typical data dimensions in -omics applications, variable selection is crucial.…”
Section: Introduction
confidence: 99%
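The "most important choices of distance" mentioned in this quotation are exposed through a metric argument when fitting the model. A hedged sketch with the classical interface, reusing the simulated rankings R from above (newer package versions pass the metric through a model-options helper instead):

```r
# Fit the same data under the Kendall distance instead of the default footrule.
fit_kendall <- compute_mallows(rankings = R, metric = "kendall", nmc = 10000)
```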