Proceedings of the Eleventh ACM International Conference on Web Search and Data Mining 2018
DOI: 10.1145/3159652.3159731

Neural Graph Learning

Abstract: Label propagation is a powerful and flexible semi-supervised learning technique on graphs. Neural networks, on the other hand, have proven track records in many supervised learning tasks. In this work, we propose a training framework with a graph-regularised objective, namely Neural Graph Machines, that can combine the power of neural networks and label propagation. This work generalises previous literature on graph-augmented training of neural networks, enabling it to be applied to multiple neural architectur…
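The graph-regularised objective described in the abstract combines a standard supervised loss with a term that penalises disagreement between the representations of neighbouring nodes, in the spirit of label propagation's smoothness assumption. A minimal NumPy sketch of such an objective, assuming a toy setup (the function name, the squared-error losses, and the edge-weight format are illustrative, not the paper's exact formulation):

```python
import numpy as np

def neural_graph_loss(preds, labels, hidden, edges, alpha=0.5):
    """Graph-regularised training objective (illustrative sketch).

    preds  : model outputs on labeled nodes
    labels : ground-truth targets for those nodes
    hidden : dict node_id -> hidden representation (1-D array)
    edges  : list of (u, v, w) weighted graph edges
    alpha  : weight of the graph-regularisation term
    """
    # Supervised term: squared error on the labeled nodes.
    supervised = float(np.mean((np.asarray(preds) - np.asarray(labels)) ** 2))
    # Graph term: weighted squared distance between the hidden
    # representations of neighbouring nodes.
    graph = sum(w * float(np.sum((hidden[u] - hidden[v]) ** 2))
                for u, v, w in edges)
    return supervised + alpha * graph

# Toy example: two labeled nodes joined by one edge of weight 1.
hidden = {0: np.array([1.0, 0.0]), 1: np.array([0.0, 1.0])}
loss = neural_graph_loss(preds=[0.9, 0.2], labels=[1.0, 0.0],
                         hidden=hidden, edges=[(0, 1, 1.0)], alpha=0.5)
```

Setting `alpha=0` recovers purely supervised training, while a large `alpha` pushes neighbouring nodes toward identical representations.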

Cited by 67 publications (11 citation statements)
References 10 publications
“…By incorporating adversarial examples into the training procedure of the neural network, the robustness of the model was shown to increase. Bui et al. [24] propose a training framework containing a graph-regularized objective, which achieved state-of-the-art performance for multi-label classification on social graphs, news categorization, semantic intent classification, and document classification. Da-Cheng et al. [37] present a graph-learning framework that yields embeddings capable of discriminating between 40 million semantic labels.…”
Section: Neural Structured Learning
Citation type: mentioning (confidence: 99%)
“…Neural Structured Learning (NSL) leverages structured signals along with feature input for training neural networks. NSL generalizes to Adversarial Learning when neighbors are generated by adversarial perturbation [23], or to Neural Graph Learning when the neighbors are explicitly represented with a graph [24].…”
Section: Introduction
Citation type: mentioning (confidence: 99%)
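The adversarial variant mentioned above generates an implicit "neighbour" for each example by perturbing the input in the direction that most increases the loss. A minimal FGSM-style sketch for a linear model with squared-error loss (the function name, the model, and the sign step are illustrative assumptions, not the cited papers' exact procedure):

```python
import numpy as np

def adversarial_neighbor(x, y, w, eps=0.1):
    """Generate an adversarial 'neighbor' of input x (illustrative sketch).

    For a linear model pred = w @ x with loss L = (pred - y)^2, the
    input gradient is dL/dx = 2 * (pred - y) * w. The neighbor steps
    in the sign of that gradient, scaled by eps.
    """
    pred = float(w @ x)
    grad = 2.0 * (pred - y) * w          # analytic input gradient
    return x + eps * np.sign(grad)       # perturbed "neighbor" example

x = np.array([1.0, -1.0])
w = np.array([0.5, 0.5])
x_adv = adversarial_neighbor(x, y=1.0, w=w, eps=0.1)
```

Training then treats `x` and `x_adv` as neighbours whose outputs should agree, the same role a graph edge plays in the explicit-graph setting.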
“…It can be utilized for training by taking advantage of structured signals alongside the feature inputs. NSL is a neural graph learning approach that trains neural networks using graphs and structured data [30]. NSL also generalizes basic adversarial learning [31] by utilizing structured data with rich relational information among the samples.…”
Section: Introduction
Citation type: mentioning (confidence: 99%)
“…The first is neural graph learning, where neighbors are connected by a graph. The second is adversarial learning, where neighbors are induced by adversarial perturbation [30].…”
Citation type: mentioning (confidence: 99%)
“…This technique is known as semi-supervised learning and has lately been applied successfully in various medical image analyses [42]. Examples of semi-supervised learning are self-learning [43,44] and neural graph learning [45], which both make use of unlabeled data in addition to a small amount of labeled data to extract additional information [43,44,46]. We believe these new algorithms might be the development needed to make AI even more useful for medical applications.…”
Citation type: mentioning (confidence: 99%)