2019 IEEE International Conference on Systems, Man and Cybernetics (SMC)
DOI: 10.1109/smc.2019.8913896
Model Selection of Bayesian Hierarchical Mixture of Experts based on Variational Inference

Cited by 1 publication (2 citation statements); references 7 publications.
“…To attain high prediction performance with interpretability, previous works have proposed several extensions of the linear regression model (e.g., [1]–[14]). In stratified regression [1], the data are stratified based on design variables, and at each level a linear regression model is defined using the explanatory and response variables.…”
Section: Introduction
Citation type: mentioning (confidence: 99%)
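
As a concrete illustration of the stratified-regression idea quoted above, the sketch below fits an independent ordinary-least-squares model within each stratum defined by a categorical design variable and predicts each sample with the model of its own stratum. The function names and the use of NumPy's least-squares solver are illustrative assumptions, not details taken from [1] or from the paper under discussion.

import numpy as np

def fit_stratified_ols(X, y, strata):
    # One independent ordinary-least-squares fit per stratum of the design variable.
    # X: (n, d) explanatory variables, y: (n,) responses, strata: (n,) stratum labels.
    models = {}
    for s in np.unique(strata):
        mask = strata == s
        Xs = np.column_stack([np.ones(mask.sum()), X[mask]])  # prepend intercept column
        coef, *_ = np.linalg.lstsq(Xs, y[mask], rcond=None)
        models[s] = coef
    return models

def predict_stratified(models, X, strata):
    # Each sample is predicted by the linear model of its own stratum.
    y_hat = np.empty(len(X))
    for s, coef in models.items():
        mask = strata == s
        Xs = np.column_stack([np.ones(mask.sum()), X[mask]])
        y_hat[mask] = Xs @ coef
    return y_hat
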
“…In stratified regression [1], the data are stratified based on design variables, and at each level a linear regression model is defined using the explanatory and response variables. Another extension of the linear regression model is the mixture-of-experts model (e.g., [2, 3]) or the hierarchical mixture-of-experts model (e.g., [4]–[8]), which consists of several experts and a gating function. Each expert is a linear regression model that outputs the response variable.…”
Section: Introduction
Citation type: mentioning (confidence: 99%)
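
To make the quoted description concrete, the sketch below shows the predictive form of a flat (non-hierarchical) mixture of experts: K linear-regression experts whose outputs are combined by a softmax gating function evaluated on the same input. The parameter layout and function name are assumptions made for illustration only; they do not reproduce the Bayesian hierarchical formulation or the variational inference procedure of the paper.

import numpy as np

def moe_predict(X, expert_W, gate_V):
    # X        : (n, d) inputs (append a column of ones for intercept terms)
    # expert_W : (K, d) weights, one linear-regression expert per row
    # gate_V   : (K, d) gating weights for a softmax gate over the K experts
    expert_out = X @ expert_W.T                   # (n, K) expert predictions
    scores = X @ gate_V.T                         # (n, K) gating scores
    scores -= scores.max(axis=1, keepdims=True)   # stabilise the softmax
    gates = np.exp(scores)
    gates /= gates.sum(axis=1, keepdims=True)     # per-sample mixing weights
    return (gates * expert_out).sum(axis=1)       # gate-weighted mixture prediction
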