2021
DOI: 10.1007/s10489-021-02438-8
Novel translation knowledge graph completion model based on 2D convolution

Cited by 29 publications (7 citation statements) | References 14 publications
“…The experiments in this paper evaluate the JointMC model on the FB15k-237 and WN18RR datasets. Several geometric and tensor decomposition-based models, such as TransE, DistMult, ComplEx, etc., and several convolution-based models, such as ConvE, HypER, and CTKGC [23], were selected as baseline models for comparison. The results obtained by comparing the evaluation metrics of each model on the dataset are shown in Table 4.…”
Section: Link Prediction Results (mentioning)
confidence: 99%
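The evaluation metrics referenced in the statement above follow the standard link prediction protocol: each test triple's true entity is ranked against all candidates, and mean reciprocal rank (MRR) and Hits@k are aggregated over those ranks. The sketch below is a minimal, illustrative implementation of that protocol in the filtered setting; the function names and the score-array layout are assumptions, not code from JointMC or CTKGC.

```python
import numpy as np

def filtered_rank(scores, true_idx, known_idx):
    """Rank the true entity among all candidates, masking out other
    known true triples (the standard 'filtered' setting)."""
    scores = scores.copy()
    mask = np.array([i for i in known_idx if i != true_idx], dtype=int)
    scores[mask] = -np.inf  # competing true triples cannot outrank the target
    # rank = 1 + number of candidates scoring strictly higher than the target
    return 1 + int((scores > scores[true_idx]).sum())

def link_prediction_metrics(ranks, ks=(1, 3, 10)):
    """Aggregate MRR and Hits@k from a list of ranks."""
    ranks = np.asarray(ranks, dtype=float)
    metrics = {"MRR": float((1.0 / ranks).mean())}
    for k in ks:
        metrics[f"Hits@{k}"] = float((ranks <= k).mean())
    return metrics

# Toy usage: three test triples whose true entities ranked 1st, 4th, and 12th.
print(link_prediction_metrics([1, 4, 12]))
```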
“…Additionally, we will compare them with state-of-the-art and representative Euclidean and complex space embedding models that have proposed knowledge graph reasoning methods, including TransE [21], DistMult [26], MuRE [36], TuckER [28], ConvE [29], ConvKB [30], and KBGAT [33] for Euclidean space, and ComplEx [27], RotatE [23], and ComplexGCN [32] for complex space. Furthermore, we will consider graph neural network-based models and recent models with outstanding predictive performance, such as R-GCN [31], MRGAT [34], and CTKGC [43]. In total, 17 models will be used as baseline algorithms for comparison with our proposed model, and the experimental data for baseline algorithms will be selected based on the experimental results from the original literature.…”
Section: Baselines (mentioning)
confidence: 99%
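The baselines listed in the statement above differ mainly in their triple scoring functions. For context, the sketch below writes out a few of the most common ones (TransE, DistMult, ComplEx) under their usual conventions; the vector shapes and function names are illustrative assumptions, not code from any of the cited models.

```python
import numpy as np

def transe_score(h, r, t, p=1):
    """TransE: -||h + r - t||_p; higher means more plausible."""
    return -float(np.linalg.norm(h + r - t, ord=p))

def distmult_score(h, r, t):
    """DistMult: <h, r, t> = sum_i h_i * r_i * t_i (a diagonal bilinear form)."""
    return float(np.sum(h * r * t))

def complex_score(h, r, t):
    """ComplEx: Re(<h, r, conj(t)>) with complex-valued embeddings."""
    return float(np.real(np.sum(h * r * np.conj(t))))

# Toy usage with d-dimensional random embeddings.
d = 8
h, r, t = (np.random.randn(d) for _ in range(3))
print(transe_score(h, r, t), distmult_score(h, r, t))
hc, rc, tc = (np.random.randn(d) + 1j * np.random.randn(d) for _ in range(3))
print(complex_score(hc, rc, tc))
```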
“…Inspired by the ConEx [17] and CTKGC [18] models, we named the proposed model ConCMH. ConCMH includes an embedding layer, an interaction layer, a feature extraction layer, and an output layer, as shown in Figure 2.…”
Section: Proposed Model (mentioning)
confidence: 99%
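As a rough illustration of the four-stage layout described in the statement above (embedding, interaction, feature extraction, output), the PyTorch sketch below stacks those stages in a generic ConvE/CTKGC-style scorer. All layer sizes, the stack-and-convolve interaction, and the module name are assumptions made for illustration; the actual ConCMH design is only described at the level quoted above.

```python
import torch
import torch.nn as nn

class ConvKGCSketch(nn.Module):
    """Generic convolutional KGC scorer with the four stages named above.
    Hyperparameters are illustrative, not taken from ConCMH or CTKGC."""
    def __init__(self, n_ent, n_rel, dim=200):
        super().__init__()
        self.ent = nn.Embedding(n_ent, dim)                                # embedding layer
        self.rel = nn.Embedding(n_rel, dim)
        self.conv = nn.Conv2d(1, 32, kernel_size=(2, 3), padding=(0, 1))   # feature extraction layer
        self.fc = nn.Linear(32 * dim, dim)                                 # project back to entity space

    def forward(self, h_idx, r_idx):
        h = self.ent(h_idx)                          # (B, dim)
        r = self.rel(r_idx)
        x = torch.stack([h, r], dim=1).unsqueeze(1)  # interaction layer: (B, 1, 2, dim)
        x = torch.relu(self.conv(x))                 # (B, 32, 1, dim)
        x = self.fc(x.flatten(1))                    # (B, dim)
        return x @ self.ent.weight.t()               # output layer: score every candidate tail

model = ConvKGCSketch(n_ent=1000, n_rel=50)
scores = model(torch.tensor([0, 1]), torch.tensor([3, 4]))
print(scores.shape)  # torch.Size([2, 1000])
```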
“…where * denotes the convolution operation, ω denotes the convolution kernel, the outermost f(·) denotes the rectified linear unit (ReLU) activation function, and the inner f(·) denotes the interaction operation between the head entity embedding h_e and the relation embedding r_e. To speed up model convergence, we set the convolution kernel to 3 × 200, drawing on the experience reported for the CTKGC model [18].…”
Section: Feature Extraction Layer (mentioning)
confidence: 99%
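Read literally, the quoted formula is roughly v = f(f(h_e, r_e) * ω): an interaction of the head and relation embeddings, convolved with kernel ω, followed by a ReLU. The snippet below sketches one plausible reading with the 3 × 200 kernel mentioned above; the outer-product interaction, the number of filters, and all shapes are assumptions, not the ConCMH or CTKGC code.

```python
import torch
import torch.nn as nn

# Hypothetical shapes: 200-dimensional head/relation embeddings, with an
# outer-product interaction f(h_e, r_e) producing a 200 x 200 matrix.
dim = 200
h_e = torch.randn(1, dim)
r_e = torch.randn(1, dim)

interaction = h_e.t() @ r_e                     # inner f(., .): (200, 200) interaction map
x = interaction.unsqueeze(0).unsqueeze(0)       # (1, 1, 200, 200) for Conv2d

conv = nn.Conv2d(1, 16, kernel_size=(3, dim))   # 3 x 200 kernel, as quoted above
features = torch.relu(conv(x))                  # outer f(.): ReLU, shape (1, 16, 198, 1)
print(features.shape)
```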