2016 24th European Signal Processing Conference (EUSIPCO)
DOI: 10.1109/eusipco.2016.7760604

Classification of multiple annotator data using variational Gaussian process inference

Cited by 3 publications (9 citation statements)
References 9 publications
“…Finally, to model the underlying real labels z given the features X, Gaussian Processes (GPs) have proven to be the most successful probabilistic approach, mainly because of their great flexibility and excellent uncertainty quantification [13,21,22]…”
Section: Probabilistic Modelling (mentioning)
confidence: 99%
“…Then, given each latent variable f_n, the underlying real label z_n is modelled with the sigmoid function σ: p(z_n = 1 | f_n) = σ(f_n) = (1 + exp(−f_n))^{−1}. Under this common classical model, the main difference between the previous approaches [13] and [21,22] is the inference procedure used: Expectation Propagation [32] in the former and Variational Inference [19,20] in the latter (recall the second paragraph of Section 1). Figure 1a) shows a graphical representation of this GP-based classical model, which is the basis of our proposal.…”
Section: Probabilistic Modelling (mentioning)
confidence: 99%
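The citation statement above describes the generative side of the GP classification model: a latent function f drawn from a GP prior and binary labels z linked to f through the sigmoid. The following is a minimal illustrative sketch, not code from the cited papers; the kernel choice, hyperparameters, and toy data are assumptions, and the inference step (Expectation Propagation or Variational Inference, as in the cited works) is deliberately omitted.

```python
import numpy as np

# Sketch of the generative model quoted above:
#   f ~ GP(0, K),  p(z_n = 1 | f_n) = sigma(f_n),  z_n ~ Bernoulli(sigma(f_n))
# All names and settings here are illustrative assumptions.

rng = np.random.default_rng(0)

def rbf_kernel(X, lengthscale=1.0, variance=1.0):
    """Squared-exponential covariance matrix for inputs X of shape (N, D)."""
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    return variance * np.exp(-0.5 * sq_dists / lengthscale**2)

def sigmoid(f):
    """Sigmoid link: sigma(f) = (1 + exp(-f))^(-1)."""
    return 1.0 / (1.0 + np.exp(-f))

# Toy inputs and one draw of the latent function f ~ GP(0, K)
X = rng.uniform(-3, 3, size=(50, 1))
K = rbf_kernel(X) + 1e-6 * np.eye(len(X))   # jitter for numerical stability
f = rng.multivariate_normal(np.zeros(len(X)), K)

# Underlying "real" labels through the sigmoid likelihood
p = sigmoid(f)
z = rng.binomial(1, p)
print("class balance:", z.mean())
```

In the cited approaches, this same model is kept fixed and only the posterior over f (and hence over z) is approximated, by Expectation Propagation in [13] and by Variational Inference in [21,22].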