2021
DOI: 10.1007/s00778-020-00640-7

Simple and automated negative sampling for knowledge graph embedding

Abstract: Negative sampling, which samples negative triplets from the non-observed ones in a knowledge graph (KG), is an essential step in KG embedding. Recently, the generative adversarial network (GAN) has been introduced into negative sampling. By sampling negative triplets with large gradients, these methods avoid the vanishing gradient problem and thus obtain better performance. However, they make the original model more complex and harder to train. In this paper, motivated by the observation that negative triplets with la…
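For context, a minimal Python sketch of the conventional baseline the abstract refers to: uniform negative sampling, which builds a negative triplet by corrupting the head or tail of an observed one. This is illustrative only, not the method proposed in the paper, and the entities and facts are made up.

    import random

    def corrupt_triplet(triplet, entities, observed):
        """Replace the head or tail with a random entity until the result is non-observed."""
        h, r, t = triplet
        while True:
            e = random.choice(entities)
            negative = (e, r, t) if random.random() < 0.5 else (h, r, e)
            if negative not in observed:  # keep only non-observed triplets as negatives
                return negative

    # Toy usage with made-up entities and facts.
    observed = {("paris", "capital_of", "france"), ("berlin", "capital_of", "germany")}
    entities = ["paris", "berlin", "france", "germany", "rome"]
    print(corrupt_triplet(("paris", "capital_of", "france"), entities, observed))

Gradient-aware strategies like the one studied in this paper differ from this uniform baseline in how the replacement entity is chosen, not in the overall corruption scheme.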

Cited by 7 publications (6 citation statements)
References 49 publications
“…iii) Negative sampling methods. We employ 3 different negative sampling methods to compare with our method, including KBGAN [4], MANS [60], MMRNS [54]. Among these methods, KBGAN [4] is an adversarial negative sampling method designed for conventional KGC, which applies reinforcement learning to optimize the models.…”
Section: Baseline Methods For Comparisons
confidence: 99%
“…Among these methods, KBGAN [4] is an adversarial negative sampling method designed for conventional KGC, which applies reinforcement learning to optimize the models. MANS [60] and MMRNS [54] are two negative sampling strategies designed for MMKGC, which utilize the multi-modal information to enhance the negative sampling process. iv) Numeric-aware KGC methods.…”
Section: Baseline Methods For Comparisons
confidence: 99%
“…In all cases, a scoring function (implemented by minimizing a loss function) guides the training of the embeddings, which are learned to yield high scores for true facts and low scores for false facts. The latter are obtained by corrupting the true facts in the KG, a task of utmost importance for the quality of the embeddings [13, 36].…”
Section: Related Work
confidence: 99%
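The statement above describes the standard scoring setup. As a hedged illustration (not necessarily the scoring function used in the cited works), the Python sketch below uses a TransE-style score, where true triplets should score higher than corrupted ones and a margin-based loss enforces that gap; all embeddings and entity names are toy values.

    import numpy as np

    rng = np.random.default_rng(0)
    dim = 8
    entity_emb = {e: rng.normal(size=dim) for e in ["paris", "france", "rome"]}
    relation_emb = {"capital_of": rng.normal(size=dim)}

    def score(h, r, t):
        # TransE-style score: negative L2 distance between (h + r) and t,
        # so true facts should receive higher (less negative) scores.
        return -np.linalg.norm(entity_emb[h] + relation_emb[r] - entity_emb[t])

    true_score = score("paris", "capital_of", "france")
    neg_score = score("paris", "capital_of", "rome")      # corrupted (false) fact
    margin_loss = max(0.0, 1.0 + neg_score - true_score)  # margin-based ranking loss

How the corrupted fact is chosen is exactly the negative sampling step that the cited works point to as critical for embedding quality.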
“…Using the full set can be defined as 1VsAll (Lacroix et al., 2018) or kVsAll (Dettmers et al., 2017), according to the positive triplets used. The methods (Cai and Wang, 2018; Zhang et al., 2021) requiring additional models for negative sampling are not considered here. … and BCE_sum (Trouillon et al., 2017), to classify the positive and negative triplets as binary classes, or use cross entropy (CE) loss (Lacroix et al., 2018) to classify the positive triplet as the true label over the negative triplets.…”
Section: Background: HPs In KG Embedding
confidence: 99%
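As a rough sketch of the two loss families this snippet contrasts, the PyTorch fragment below assumes a 1VsAll-style score matrix over all candidate tail entities: binary cross-entropy labels the true triplet 1 and every other candidate 0, while cross-entropy treats the true tail as the correct class over all candidates. Tensor shapes and names are illustrative assumptions, not taken from the cited papers.

    import torch
    import torch.nn.functional as F

    # Assumed 1VsAll setup: each query scores every candidate tail entity.
    scores = torch.randn(4, 100)              # 4 queries, 100 candidate entities
    true_idx = torch.tensor([3, 17, 42, 99])  # index of the true tail per query

    # BCE: the true triplet is labeled 1, every other candidate is labeled 0.
    labels = torch.zeros_like(scores)
    labels[torch.arange(4), true_idx] = 1.0
    bce_loss = F.binary_cross_entropy_with_logits(scores, labels)

    # CE: the true tail competes against all other entities as a single class.
    ce_loss = F.cross_entropy(scores, true_idx)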