2017
DOI: 10.48550/arxiv.1703.06103
Preprint
Modeling Relational Data with Graph Convolutional Networks


Cited by 69 publications (125 citation statements)
References 0 publications
“…include in a particular query. Notably, our results differ from those of Schlichtkrull et al. (2017), who found performance decreases with increasing node degree in a graph convolution-based model, indicating that our approach is promising to pursue in graphs with high mean node degree.…”
Section: Scaling Properties (contrasting)
confidence: 99%
“…The approach is inspired by computational modeling of biological neural architectures for knowledge representation (Crawford et al., 2015), and is related to KBC methods based on convolution of graph neighborhoods (Schlichtkrull et al., 2017; Dettmers et al., 2018; Nguyen et al., 2018), in which inference is performed over representations of aggregated entity neighborhoods. Recent work has extended this idea using Graph Attention Networks (GATs) (Nathani et al., 2018), which assign attention weights to entries in a graph neighborhood, these being later combined.…”
Section: Introduction (mentioning)
confidence: 99%
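The "convolution of graph neighborhoods" described in the excerpt above can be sketched as a single relational graph-convolution layer in the style of Schlichtkrull et al. (2017): each node aggregates messages from its neighbors through relation-specific weight matrices, plus a self-loop. This is a minimal illustrative sketch; the graph, dimensions, and function names are assumptions, not from the paper.

```python
import numpy as np

def rgcn_layer(h, triples, weights, w_self):
    """One relational graph-convolution layer (sketch).

    h       : (num_nodes, d) current node embeddings
    triples : list of (src, rel, dst) typed edges
    weights : dict mapping each relation to a (d, d) weight matrix
    w_self  : (d, d) self-loop weight matrix
    """
    out = h @ w_self  # self-connection term
    # count incoming edges per (node, relation) for mean normalization
    counts = {}
    for s, r, t in triples:
        counts[(t, r)] = counts.get((t, r), 0) + 1
    # aggregate relation-typed neighbor messages into each target node
    for s, r, t in triples:
        out[t] += (h[s] @ weights[r]) / counts[(t, r)]
    return np.maximum(out, 0.0)  # ReLU nonlinearity

# toy graph: 3 nodes, one relation type, two edges into node 2
rng = np.random.default_rng(0)
h = rng.normal(size=(3, 4))
W = {"rel0": rng.normal(size=(4, 4))}
h_next = rgcn_layer(h, [(0, "rel0", 2), (1, "rel0", 2)], W, np.eye(4))
print(h_next.shape)  # (3, 4)
```

Stacking such layers lets inference draw on multi-hop entity neighborhoods, which is the aggregation idea the cited KBC methods share.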
“…Several approaches focus on learning embeddings that leverage structured graph representations like Word-Net [18] to capture the semantics represented explicitly in such resources. Among such approaches, TransE [19], HolE [20], RDF2Vec [21], Graph Convolutions [22] and ComE+ [23] (see [24] for a recent review) directly encode the representations contained in the knowledge graph, which is typically an already condensed and filtered version of real-world data. A related research direction, with approaches like NASARI [25], SW2V [26] and Vecsigrafo [12], proposes to co-train word and concept embeddings based not only on knowledge graphs, but also on text corpora through lexical specificity or word-sense disambiguation.…”
Section: Related Work (mentioning)
confidence: 99%
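Among the knowledge-graph embedding methods named above, TransE has the simplest scoring rule: a triple (head, relation, tail) is plausible when the tail embedding sits near the head embedding translated by the relation vector, i.e. tail ≈ head + relation. A minimal sketch, with toy vectors chosen for illustration:

```python
import numpy as np

def transe_score(head, rel, tail):
    """TransE plausibility score: negative L2 distance
    ||head + rel - tail||. Higher (closer to zero) = more plausible."""
    return -np.linalg.norm(head + rel - tail)

# toy 2-d embeddings where the translation holds exactly
h = np.array([1.0, 0.0])
r = np.array([0.0, 1.0])
t = h + r

good = transe_score(h, r, t)    # translation fits exactly
bad = transe_score(h, r, -t)    # translation badly violated
```

Training pushes scores of observed triples above those of corrupted ones, which is how the structure of a resource like WordNet gets distilled into the embedding space.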
“…DELFT is not the first to use graph neural networks [29,47,48, inter alia] for question answering. Entity-GCN [13], DFGN [43], and HDE [52] build the entity graph with entity co-reference and co-occurrence in documents and apply a GNN to the graph to rank the top entity as the answer.…”
Section: Graph Network for QA (mentioning)
confidence: 99%
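The graph construction step mentioned in this last excerpt, connecting entities that co-occur in the same document, can be sketched as follows. This is a simplified illustration (co-reference resolution is omitted, and the entity lists are invented):

```python
from itertools import combinations

def build_entity_graph(docs):
    """Build an undirected entity graph from per-document entity lists:
    any two entities mentioned in the same document share an edge."""
    edges = set()
    for entities in docs:
        for a, b in combinations(sorted(set(entities)), 2):
            edges.add((a, b))
    return edges

docs = [["Paris", "France"], ["France", "EU", "Paris"]]
print(sorted(build_entity_graph(docs)))
# [('EU', 'France'), ('EU', 'Paris'), ('France', 'Paris')]
```

A GNN such as the relational convolution above would then propagate evidence over this graph before ranking candidate answer entities.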