2019
DOI: 10.1109/tpds.2019.2892972
An Empirical Study on Distributed Bayesian Approximation Inference of Piecewise Sparse Linear Models

Cited by 6 publications (7 citation statements)
References 21 publications
“…More details are provided in Supplementary Methods. The interpretable machine learning solves data-classification and data-regression problems simultaneously by maximizing a novel information criterion (the factorized information criterion, FIC) with an Expectation-Maximization-like algorithm (factorized asymptotic Bayesian inference, FAB), thus constructing a piecewise sparse linear model [12,13]. Figure 4a, b shows the visualization of the model constructed by the interpretable machine learning.…”
Section: Machine Learning Modeling by the Interpretable Machine Learning
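The passage above describes FAB as an Expectation-Maximization-like alternation that fits sparse linear experts to latent components. The following minimal Python sketch illustrates that alternation for a piecewise sparse linear regressor under loose assumptions: it is not the cited FAB algorithm, since it swaps the FIC objective and L0 penalties for a Lasso (L1) stand-in and uses plain residual-based responsibilities; fit_piecewise_sparse and all parameter names are illustrative.

import numpy as np
from sklearn.linear_model import Lasso

def fit_piecewise_sparse(X, y, K=3, alpha=0.1, n_iter=20, seed=0):
    # EM-style alternation: weighted sparse fits, then soft reassignment.
    rng = np.random.default_rng(seed)
    resp = rng.dirichlet(np.ones(K), size=X.shape[0])  # soft assignments (n, K)
    experts = [Lasso(alpha=alpha) for _ in range(K)]
    for _ in range(n_iter):
        # M-step-like update: one sparse linear expert per component,
        # each sample weighted by its responsibility for that component.
        for k in range(K):
            experts[k].fit(X, y, sample_weight=resp[:, k])
        # E-step-like update: responsibilities from squared residuals
        # (a Gaussian-likelihood surrogate with fixed unit variance).
        sq = np.stack([(y - e.predict(X)) ** 2 for e in experts], axis=1)
        resp = np.exp(-0.5 * sq)
        resp /= resp.sum(axis=1, keepdims=True) + 1e-12
    return experts, resp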
“…The factorized asymptotic Bayesian inference hierarchical mixture of experts (FAB/HMEs) constructs a piecewise sparse linear model that assigns sparse linear experts to individual partitions in feature space and expresses the whole model as patches of local experts [12,13]. By maximizing the factorized information criterion, which includes two L0-regularizations (one for partition-structure determination and one for feature selection within individual experts), FAB/HMEs performs partition-structure determination and feature selection at the same time.…”
Section: FAB/HMEs
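The "patches of local experts" structure quoted above can be pictured at prediction time as a gate that routes each sample to region-specific sparse linear models. The sketch below is again a loose illustration rather than the FAB/HMEs implementation (which also learns the hierarchical partition structure itself by maximizing FIC); it blends expert predictions with soft gate weights, and predict_patchwork plus the choice of gate are assumptions.

import numpy as np
from sklearn.linear_model import LogisticRegression

def predict_patchwork(X, gate, experts):
    # gate: fitted multiclass classifier over the K partitions (soft gating).
    # experts: K fitted sparse linear models, one per feature-space partition.
    weights = gate.predict_proba(X)                            # (n, K)
    local = np.stack([e.predict(X) for e in experts], axis=1)  # (n, K)
    return (weights * local).sum(axis=1)  # gate-weighted patchwork prediction

# Hypothetical wiring with the earlier sketch:
#   experts, resp = fit_piecewise_sparse(X, y)
#   gate = LogisticRegression(max_iter=1000).fit(X, resp.argmax(axis=1))
#   y_hat = predict_patchwork(X, gate, experts)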
“…In addition, there have been other reports on interpreting models with the aim of discovering new mechanisms [12,13]. The factorized asymptotic Bayesian inference hierarchical mixture of experts (FAB/HMEs) was developed [14,15] and applied to extract knowledge on spin-driven thermoelectric materials [16]. Although FAB/HMEs offers interpretability, sparsity, and predictive ability, the nonlinearity between X and y that divides samples into components is limited to squared terms and cross terms of X.…”
Section: Introduction
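The limitation noted in the last excerpt, that FAB/HMEs captures nonlinearity in X only through squared and cross terms, corresponds to a fixed degree-2 polynomial feature map. A small sketch of that expansion with scikit-learn (the toy X is illustrative):

import numpy as np
from sklearn.preprocessing import PolynomialFeatures

X = np.array([[1.0, 2.0], [3.0, 4.0]])  # toy design matrix with features x1, x2
expand = PolynomialFeatures(degree=2, include_bias=False)
X_quad = expand.fit_transform(X)
# Columns: x1, x2, x1**2, x1*x2, x2**2 -- only these squared and cross
# terms are available for separating samples into components.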