2018
DOI: 10.48550/arxiv.1811.11103
Preprint

Bayesian graph convolutional neural networks for semi-supervised classification

Cited by 6 publications (7 citation statements)
References 0 publications
“…Most methods from the above two classes treat the graph as a fixed observation, and only a few methods [16,22,12] treat the graph as a random variable and model it with generative models. Among them, Ng et al [16] focus more on an active learning setting on graphs; Zhang et al [22] model the graph with a stochastic block model and do not consider the interaction between the graph and the features or the labels in the generative model of the graph; the work of Liu [12] is the most similar to our framework, but is restricted to linear models. To the best of our knowledge, our work is the first to propose a generative framework for graph-based semi-supervised learning that models the joint distribution of features, outcomes, and the graph with flexible nonlinear models.…”
Section: Generative Methods For Graph-based Semi-supervised Learning (mentioning)
confidence: 99%
“…A few previous works [22,12] that try to apply generative models to graph-based semi-supervised learning have been restricted to relatively simple model families due to the difficulty of efficiently training generative models. Thanks to recent advances in scalable variational inference techniques [8,7], we are able to propose a flexible generative framework for graph-based semi-supervised learning.…”
Section: Introduction (mentioning)
confidence: 99%
“…We note that recent work [33] also claims to estimate uncertainty in the graph convolutional neural network setting. This work uses a two-stage strategy: it first treats the observed network as a realisation of a parametric Bayesian relational model, and then uses Bayesian neural networks to infer the model parameters.…”
Section: Related Work (mentioning)
confidence: 99%
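The two-stage strategy described in this statement lends itself to a Monte Carlo reading: draw graphs from the posterior of the parametric relational model, draw GCN weights for each sampled graph, and average the resulting class probabilities. The sketch below is a minimal illustration under those assumptions, not the authors' implementation; the samplers sample_graph_posterior and sample_weight_posterior are hypothetical placeholders supplied by the caller.

    # Hypothetical sketch (not the cited paper's code) of two-stage Monte Carlo
    # prediction: sample graphs, then sample GCN weights for each sampled graph.
    import numpy as np

    def normalized_adjacency(A):
        """Symmetric GCN normalization D^{-1/2} (A + I) D^{-1/2}."""
        A_hat = A + np.eye(A.shape[0])
        d = A_hat.sum(axis=1)
        D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
        return D_inv_sqrt @ A_hat @ D_inv_sqrt

    def gcn_forward(A, X, W1, W2):
        """Two-layer GCN: softmax(A_norm · relu(A_norm · X · W1) · W2)."""
        A_norm = normalized_adjacency(A)
        H = np.maximum(A_norm @ X @ W1, 0.0)            # hidden layer with ReLU
        logits = A_norm @ H @ W2
        logits = logits - logits.max(axis=1, keepdims=True)  # numerical stability
        P = np.exp(logits)
        return P / P.sum(axis=1, keepdims=True)

    def bayesian_gcn_predict(X, sample_graph_posterior, sample_weight_posterior,
                             n_graphs=5, n_weights=10):
        """Average class probabilities over sampled graphs and sampled weights."""
        probs = []
        for _ in range(n_graphs):
            A_s = sample_graph_posterior()               # stage 1: graph sample
            for _ in range(n_weights):
                W1, W2 = sample_weight_posterior()       # stage 2: weight sample
                probs.append(gcn_forward(A_s, X, W1, W2))
        return np.mean(probs, axis=0)                    # marginalized predictive

The averaging step is what distinguishes this reading from a single fixed-graph, fixed-weight GCN: uncertainty in both the graph and the weights is marginalized out rather than conditioned away.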
“…This function is often approximated using a fixed neural network that provides a likely correlational explanation for the relationship between x and y. There has been good work done in this area [12], and there are two main types of Bayesian deep learning. This is not an exhaustive list of all the methods, but a broad overview of two.…”
Section: Bayesian Deep Learning (mentioning)
confidence: 99%
“…This integral is often intractable in practice, and a number of techniques have been proposed in the literature to overcome this problem [12]. Intuitively, a Bayesian neural network encodes information about the distribution over output values given certain inputs.…”
Section: Bayesian Deep Learning (mentioning)
confidence: 99%
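For reference, the intractable integral discussed in this statement is the posterior predictive distribution of a Bayesian neural network with weights W and training data D; a standard Monte Carlo approximation (a textbook formulation, not quoted from the source) replaces the integral with samples from an approximate posterior q(W):

    p(y^* \mid x^*, \mathcal{D})
      = \int p(y^* \mid x^*, W)\, p(W \mid \mathcal{D})\, \mathrm{d}W
      \approx \frac{1}{S} \sum_{s=1}^{S} p(y^* \mid x^*, W_s),
      \qquad W_s \sim q(W) \approx p(W \mid \mathcal{D})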