2019
DOI: 10.48550/arxiv.1912.10000
Preprint
Probability Calibration for Knowledge Graph Embedding Models

Pedro Tabacof,
Luca Costabello

Abstract: Knowledge graph embedding research has overlooked the problem of probability calibration. We show popular embedding models are indeed uncalibrated. That means probability estimates associated with predicted triples are unreliable. We present a novel method to calibrate a model when ground truth negatives are not available, which is the usual case in knowledge graphs. We propose to use Platt scaling and isotonic regression alongside our method. Experiments on three datasets with ground truth negatives show our co…
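The abstract mentions Platt scaling and isotonic regression as calibration techniques. A minimal sketch of both, using scikit-learn on synthetic scores standing in for a KGE model's raw triple scores (the data and score distributions here are invented for illustration, not the paper's setup):

```python
# Illustrative sketch (not the authors' code): calibrating raw triple scores
# from a hypothetical KGE model with Platt scaling and isotonic regression.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.isotonic import IsotonicRegression

rng = np.random.default_rng(0)

# Synthetic stand-in for model scores: positives score higher on average.
pos_scores = rng.normal(loc=2.0, scale=1.0, size=500)
neg_scores = rng.normal(loc=-1.0, scale=1.0, size=500)
scores = np.concatenate([pos_scores, neg_scores])
labels = np.concatenate([np.ones(500), np.zeros(500)])

# Platt scaling: fit a one-feature logistic regression on held-out scores,
# mapping each raw score to a calibrated probability.
platt = LogisticRegression()
platt.fit(scores.reshape(-1, 1), labels)
platt_probs = platt.predict_proba(scores.reshape(-1, 1))[:, 1]

# Isotonic regression: a monotone, non-parametric mapping to [0, 1];
# more flexible than Platt scaling but needs more calibration data.
iso = IsotonicRegression(out_of_bounds="clip")
iso.fit(scores, labels)
iso_probs = iso.predict(scores)
```

Both fits assume labeled negatives are available; the paper's contribution is precisely handling the case where they are not.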

Cited by 1 publication (2 citation statements) · References 16 publications (23 reference statements)
“…Pouya Pezeshkpour et al [178] revisited current issues with evaluation metrics, arguing that the commonly adopted techniques do not properly assess KGC, are difficult to use for calibration, and cannot reliably differentiate between models. Calibration is also a vital aspect of KGC that has recently received attention [179]. Safavi et al [180] show that calibration procedures can significantly reduce the calibration error of KGE models in downstream tasks.…”
Section: Comparative Analysis
confidence: 99%
“…Model calibration requires reliable confidence estimation and effective calibration strategies to correct calibration errors. For KGE, two confidence estimation techniques, SigmoidMax (SIG) and TopKSoftmax (TOP), are exploited in [179] [180].…”
Section: Comparative Analysis
confidence: 99%
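The citation above names two confidence estimation techniques, SigmoidMax and TopKSoftmax. One plausible interpretation of those labels (a sketch, not the cited papers' code): map a single raw score through a sigmoid, or take a softmax restricted to the k highest-scoring candidates in a link-prediction ranking:

```python
# Illustrative interpretation of SigmoidMax- and TopKSoftmax-style
# confidence estimation for link-prediction scores (names assumed).
import numpy as np

def sigmoid_confidence(score):
    """Map a single raw triple score to (0, 1) with a logistic sigmoid."""
    return 1.0 / (1.0 + np.exp(-score))

def topk_softmax_confidence(candidate_scores, k=10):
    """Softmax over the k highest-scoring candidates; returns the
    probability mass assigned to the single best candidate."""
    top = np.sort(candidate_scores)[-k:]
    shifted = top - top.max()           # subtract max for numerical stability
    weights = np.exp(shifted)
    return weights[-1] / weights.sum()  # share of the top candidate

# Example: scores for five candidate tail entities of one query.
scores = np.array([0.1, 2.5, -1.0, 3.2, 0.7])
sig = sigmoid_confidence(scores[3])
top = topk_softmax_confidence(scores, k=3)
```

The sigmoid treats each triple independently, while the top-k softmax expresses confidence relative to the other ranked candidates for the same query.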