2022
DOI: 10.1007/s10489-021-03149-w

Improving temporal knowledge graph embedding using tensor factorization

Cited by 17 publications (21 citation statements)
References 21 publications
“…The PubChem corpus is used to fine-tune an LLM. 17,24 (2) The Barlow Twins neural network provides a learned representation of molecules in the context of bioassays. 15 It first independently encodes both the molecular and textual information and then passes them through a unified projector.…”
Section: Results and Discussion (mentioning)
Confidence: 99%
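The quoted passage describes a dual-encoder design: molecular and textual inputs are encoded independently and then mapped through one shared ("unified") projector. A minimal NumPy sketch of that data flow follows; all dimensions and weight matrices here are illustrative stand-ins, not the cited model's actual encoders.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical dimensions: the two modality encoders output different sizes.
d_mol, d_txt, d_proj = 8, 6, 4
W_mol = rng.normal(size=(d_mol, d_proj))      # molecular encoder head (stand-in)
W_txt = rng.normal(size=(d_txt, d_proj))      # textual encoder head (stand-in)
W_shared = rng.normal(size=(d_proj, d_proj))  # unified projector shared by both

def embed(mol_features, txt_features):
    # Encode each modality independently...
    z_mol = mol_features @ W_mol
    z_txt = txt_features @ W_txt
    # ...then pass both representations through the same projector.
    return z_mol @ W_shared, z_txt @ W_shared
```

The shared projector is what places both modalities in a common space, which is the property the Barlow Twins objective then exploits.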
“…From the intuition of RESCAL, researchers proposed various models. For example, SimplE (Kazemi and Poole, 2018) extends DistMult by learning separate embeddings for the entity pair (h, t) of a triple, together with two separate diagonal matrices, diag(M_r) and diag(M_r′), to express complex relation types. TNTSimplE (He et al., 2023) extends SimplE to temporal KGs and captures symmetric, asymmetric and inverse relation types.…”
Section: Related Work (mentioning)
Confidence: 99%
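The excerpt above summarizes SimplE's core idea: each entity gets a head-role and a tail-role embedding, and each relation gets two diagonal factors, one for the relation and one for its inverse. A minimal NumPy sketch of the resulting scoring function, with all names, values, and the embedding dimension purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4  # embedding dimension (illustrative)

# Each entity has two embeddings: one for the head role, one for the tail role.
head_emb = {"h": rng.normal(size=d), "t": rng.normal(size=d)}
tail_emb = {"h": rng.normal(size=d), "t": rng.normal(size=d)}
# Each relation r has two diagonal factors: one for r, one for its inverse.
rel_emb = {"r": rng.normal(size=d), "r_inv": rng.normal(size=d)}

def simple_score(h, r, t):
    """Average of the two bilinear terms used by SimplE for triple (h, r, t)."""
    fwd = np.sum(head_emb[h] * rel_emb[r] * tail_emb[t])
    bwd = np.sum(head_emb[t] * rel_emb[r + "_inv"] * tail_emb[h])
    return 0.5 * (fwd + bwd)
```

Averaging the forward term with the inverse-relation term is what lets the two embeddings of each entity inform one another during training.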
“…From the intuition of RESCAL, researchers proposed various models. For example, SimplE (Kazemi and Poole, 2018) extends DistMult by learning separate embeddings for the entity pair (h, t) of a triple, together with two separate diagonal matrices, diag(M_r) and diag(M_r′), to express complex relation types. TNTSimplE (He et al., 2023) extends SimplE to temporal KGs. The use of KGs enables a machine to learn and automate inference over the cybersecurity domain, such as attack prediction (Sun et al., 2022), threat prediction (Zhao et al., 2022) and threat analysis (Li et al., 2023).…”
Section: IJWIS 19,3/4 (mentioning)
Confidence: 99%
“…• microsoft/deberta-v3-base (He et al., 2022): a deep learning language model that uses the Transformer architecture and has been pre-trained on a large amount of text data. It focuses on understanding the syntactic and semantic structure of natural language and is designed for natural language processing tasks such as sentiment analysis, text classification, and text generation.…”
Section: Model Selection (mentioning)
Confidence: 99%