Computer Science & Information Technology (CS & IT) 2020
DOI: 10.5121/csit.2020.101519
Negative Sampling in Knowledge Representation Learning: A Mini-Review

Abstract: Knowledge representation learning (KRL) aims at encoding components of a knowledge graph (KG) into a low-dimensional continuous space, which has brought considerable success in applying deep learning to graph embedding. Most well-known KGs contain only positive instances, for space efficiency. Typical KRL techniques, especially translational distance-based models, are trained by discriminating positive and negative samples. Thus, negative sampling is unquestionably a non-trivial step in KG embedding. The qua…
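As a hedged illustration of the training scheme the abstract describes, the sketch below scores triples with a TransE-style translational distance and applies a margin ranking loss to one positive/negative pair. The embeddings, dimensions, and triples are illustrative assumptions, not details from the paper.

    # Minimal sketch of translational-distance training with a negative sample.
    import numpy as np

    rng = np.random.default_rng(0)
    n_entities, n_relations, dim, margin = 100, 10, 50, 1.0

    # Randomly initialized embeddings; a real system would train these by SGD.
    E = rng.normal(size=(n_entities, dim))   # entity embeddings
    R = rng.normal(size=(n_relations, dim))  # relation embeddings

    def score(h, r, t):
        """TransE distance ||h + r - t||; lower means more plausible."""
        return np.linalg.norm(E[h] + R[r] - E[t])

    def margin_loss(pos, neg):
        """Margin ranking loss over one (positive, negative) triple pair."""
        (h, r, t), (h2, r2, t2) = pos, neg
        return max(0.0, margin + score(h, r, t) - score(h2, r2, t2))

    # Corrupt the tail uniformly at random to create the negative sample.
    pos = (3, 2, 7)
    neg = (3, 2, int(rng.integers(n_entities)))
    print(margin_loss(pos, neg))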

Cited by 1 publication (1 citation statement) · References 86 publications
“…Here, s′ and o′ represent the corrupted subject and object. It is worth mentioning some negative sampling techniques [123], such as uniform sampling [31], Bernoulli sampling [32], KBGAN [124], IGAN [125], and NSCaching [126].…”
Section: Negatives Generation (mentioning)
confidence: 99%
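To make the two classical corruption schemes named in this citation statement concrete, here is a hedged sketch of uniform versus Bernoulli negative sampling over (subject, relation, object) triples. The toy KG and function names are illustrative assumptions, not code from the cited works.

    import random
    from collections import defaultdict

    triples = [(0, 0, 1), (0, 0, 2), (3, 0, 1), (4, 1, 5)]  # (s, p, o)
    entities = {e for s, _, o in triples for e in (s, o)}

    def uniform_corrupt(triple):
        """Uniform sampling: replace subject or object with a random entity."""
        s, p, o = triple
        e = random.choice(list(entities))
        return (e, p, o) if random.random() < 0.5 else (s, p, e)

    # Bernoulli sampling (Wang et al., 2014) corrupts the subject with
    # probability tph / (tph + hpt), where tph = mean tails per head and
    # hpt = mean heads per tail for the triple's relation, which reduces
    # false negatives for 1-to-N and N-to-1 relations.
    tails_per_head, heads_per_tail = defaultdict(set), defaultdict(set)
    for s, p, o in triples:
        tails_per_head[(p, s)].add(o)
        heads_per_tail[(p, o)].add(s)

    def bernoulli_corrupt(triple):
        s, p, o = triple
        heads = [k for k in tails_per_head if k[0] == p]
        tails = [k for k in heads_per_tail if k[0] == p]
        tph = sum(len(tails_per_head[k]) for k in heads) / max(1, len(heads))
        hpt = sum(len(heads_per_tail[k]) for k in tails) / max(1, len(tails))
        e = random.choice(list(entities))
        return (e, p, o) if random.random() < tph / (tph + hpt) else (s, p, e)

    print(uniform_corrupt(triples[0]), bernoulli_corrupt(triples[0]))

The GAN-based (KBGAN, IGAN) and caching-based (NSCaching) techniques cited alongside these replace the random choice above with a learned or cached distribution over harder negatives.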