2019
DOI: 10.1007/978-3-030-31624-2_6
Simplified Representation Learning Model Based on Parameter-Sharing for Knowledge Graph Completion

Cited by 8 publications (1 citation statement)
References 31 publications
“…T′ is the set of negative triples with respect to T defined above, i.e., given a positive triple (h, r, t) ∈ T, we need to sample a negative triple (h′, r, t′) to compute the loss. We construct the set of negative triples by replacing the head entity h or the tail entity t with a random entity uniformly sampled from the knowledge graph G, following previous work [14], [49], [50]; this strategy is widely used in the literature. Therefore, h′ ∈ E and t′ ∈ E denote the negative head entity and negative tail entity obtained by random sampling, respectively.…”
Section: Knowledge Graph Representation Learning (mentioning)
confidence: 99%
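To illustrate the uniform negative-sampling strategy described in the quoted passage, here is a minimal Python sketch. The function name, the 50/50 head-vs-tail corruption choice, and the simple resampling check are assumptions made for illustration; they are not details taken from the cited paper or from references [14], [49], [50].

```python
import random

def sample_negative_triple(positive_triple, entities):
    """Corrupt a positive triple (h, r, t) by replacing its head or tail
    with an entity drawn uniformly at random from the knowledge graph.

    positive_triple: (h, r, t) tuple of entity/relation identifiers.
    entities: list of all entity identifiers in the knowledge graph G.
    Returns a negative triple (h', r, t) or (h, r, t').
    """
    h, r, t = positive_triple
    # Assumed 50/50 choice between corrupting the head and the tail;
    # the cited works may use a different replacement policy.
    if random.random() < 0.5:
        h_neg = random.choice(entities)
        # Resample until the replacement differs from the original head.
        while h_neg == h:
            h_neg = random.choice(entities)
        return (h_neg, r, t)
    else:
        t_neg = random.choice(entities)
        # Resample until the replacement differs from the original tail.
        while t_neg == t:
            t_neg = random.choice(entities)
        return (h, r, t_neg)
```

Note that this sketch only avoids reproducing the original entity; a stricter (filtered) setting would also check that the corrupted triple does not already appear in T.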