2022
DOI: 10.3390/e24060846

Reliable Semantic Communication System Enabled by Knowledge Graph

Abstract: Semantic communication is a promising technology used to overcome the challenges of large bandwidth and power requirements caused by the data explosion. Semantic representation is an important issue in semantic communication. The knowledge graph, powered by deep learning, can improve the accuracy of semantic representation while removing semantic ambiguity. Therefore, we propose a semantic communication system based on the knowledge graph. Specifically, in our system, the transmitted sentences are converted in…
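The abstract describes mapping each transmitted sentence to knowledge-graph triples that act as basic semantic symbols and ordering them by semantic importance before transmission. The Python sketch below only illustrates that pipeline; the Triple type, the extraction rules, and the importance score are illustrative placeholders, not the paper's actual models.

from dataclasses import dataclass

@dataclass(frozen=True)
class Triple:
    head: str
    relation: str
    tail: str

def extract_triples(sentence: str) -> list[Triple]:
    # Toy stand-in for the learned, KG-based extractor described in the abstract.
    if "Einstein" in sentence:
        return [Triple("Einstein", "born_in", "Ulm"),
                Triple("Einstein", "field", "physics")]
    return []

def semantic_importance(triple: Triple, sentence: str) -> float:
    # Toy score: fraction of the triple's entities that appear in the sentence.
    tokens = sentence.lower().split()
    parts = [triple.head.lower(), triple.tail.lower()]
    return sum(p in tokens for p in parts) / len(parts)

def encode(sentence: str) -> list[Triple]:
    # Order triples so the most important semantic symbols are transmitted first.
    triples = extract_triples(sentence)
    return sorted(triples, key=lambda t: semantic_importance(t, sentence), reverse=True)

print(encode("Einstein was born in Ulm and worked in physics."))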

Cited by 35 publications (19 citation statements) | References 44 publications
“…p(0) = 0 corroborate, whenever σ^2 → ∞ and P_s^max → 0, that η(s, ŝ) = 0, which affirms maximum semantic dissimilarity or semantic irrelevance [31]. Consequently, Theorem 1 translates to the following remarks.…”
Section: Performance Limits of DeepSC (supporting)
confidence: 58%
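Read as a formula, the quoted remark states a limiting case of the semantic similarity metric η. The LaTeX below is a paraphrase of that statement in the passage's own notation, not a result reproduced from the cited theorem:

\lim_{\sigma^2 \to \infty,\; P_s^{\max} \to 0} \eta(s, \hat{s}) = 0, \qquad 0 \le \eta(s, \hat{s}) \le 1,

i.e., as the channel noise power grows without bound and the semantic transmit power budget vanishes, the recovered sentence ŝ becomes semantically irrelevant to the source sentence s.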
“…where 0 ≤ η(ŝ, s) ≤ 1 and B_Φ(·) denotes the output of BERT, an enormous pre-trained model with billions of parameters used for mining semantic information [46]. As defined in (13), the metric η(s, ŝ) takes values between 0 and 1 (which mirror semantic irrelevance and semantic consistency, respectively) [174]. Meanwhile, since BERT is sensitive to polysemy, semantic information is quantified by the sentence similarity metric at the sentence level [135].…”
Section: E. Semantic Similarity Metric (mentioning)
confidence: 99%
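The quoted metric η(s, ŝ) is built from BERT sentence representations B_Φ(·). A minimal sketch of one common instantiation follows, using Hugging Face Transformers and cosine similarity of mean-pooled BERT embeddings; the checkpoint name, the pooling choice, and the use of cosine similarity are assumptions for illustration, not the exact definition in the cited works.

import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
bert = AutoModel.from_pretrained("bert-base-uncased")

def embed(sentence: str) -> torch.Tensor:
    # Mean-pool the last hidden states into a single sentence vector.
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = bert(**inputs).last_hidden_state  # (1, seq_len, hidden_dim)
    return hidden.mean(dim=1).squeeze(0)

def eta(s: str, s_hat: str) -> float:
    # Semantic similarity between the source sentence s and the recovered s_hat.
    return torch.nn.functional.cosine_similarity(embed(s), embed(s_hat), dim=0).item()

print(eta("the cat sat on the mat", "a cat is sitting on the mat"))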
“…In this framework, semantic information is represented by a KG made up of a set of semantic triples and the receiver recovers the original text using a graph-to-text generation model. Another SemCom system based on the KG is proposed in [30]. In this system, transmitted sentences are converted into triplets using the KG, which are seen as fundamental semantic symbols for semantic extraction and restoration, and they are ordered based on semantic relevance.…”
Section: Related Work (mentioning)
confidence: 99%
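At the receiver, the quoted systems restore text from the received triples with a graph-to-text generation model. The placeholder below only verbalizes triples with a naive template to show where such a model sits in the pipeline; it is not the generation model used in [29] or [30].

def triples_to_text(triples: list[tuple[str, str, str]]) -> str:
    # Placeholder for a learned graph-to-text model: verbalize each
    # (head, relation, tail) triple and join the resulting clauses.
    clauses = [f"{h} {r.replace('_', ' ')} {t}" for h, r, t in triples]
    return ". ".join(clauses) + "."

received = [("Einstein", "born_in", "Ulm"), ("Einstein", "field", "physics")]
print(triples_to_text(received))  # "Einstein born in Ulm. Einstein field physics."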
“…Similarly, a method to generate a summary of sentences by using a knowledge base is shown in [35]. Recently, KGs have been utilized in the context of SemCom design [29], [30], [36], [37]. However, these works do not focus on the issue addressed in this paper, which is to design a SemCom system with a significant overhead reduction and little compromise on accuracy.…”
Section: Related Work (mentioning)
confidence: 99%