2011
DOI: 10.21236/ada544137
Towards a Theory of Semantic Communication (Extended Technical Report)

Abstract: This paper studies methods of quantitatively measuring semantic information in communication. We review existing work on quantifying semantic information, then investigate a model-theoretical approach for semantic data compression and reliable semantic communication. We relate our approach to Shannon's statistical measurement of information, and show that Shannon's source and channel coding theorems have semantic counterparts.

Cited by 7 publications (17 citation statements). References 42 publications.
“…That has the advantage of being unambiguous, but this constraint might seem inappropriate in dealing with natural languages given the nebulous meaning of many words. However, it accords with the definition of sign: when a sign vehicle has more than one mapping to a domain, each determines a different immediate object, or the variants may constitute different signs [110,111]. A morphism may represent a domain or it may project an expectation of effects, such as a disposition of a sign user to react to a sign vehicle in a particular way; for example, a soldier responding to a command or a computer processing a type of input.…”
Section: Semiotics
confidence: 98%
“…The first theory of semantic information was proposed by Carnap and Bar-Hillel in the early 1950s and was based on logical probabilities [174], [175]. They used logical probabilities (as opposed to the statistical probabilities used in the Shannon information theory) over the content of a sentence to quantify the amount of information in a sentence in a given language [95], [176]. In their theory, information is perceived as a set of excluded possibilities [177], and a sentence's logical probability is measured by the likelihood that it would be true in all possible situations [176].…”
Section: Theories of Semantic Information
confidence: 99%
“…To this end, Carnap and Bar-Hillel's semantic information theory (SIT) asserts that "A and B" provides more information than "A" or "B" (since "A and B" is less likely to be true), "A" provides more information than "A or B", and a tautology (which is always true) provides no information [176].…”
Section: Theories of Semantic Information
confidence: 99%
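The ordering asserted above follows directly from the logical-probability measure. A minimal sketch, assuming a toy model of two atomic propositions (the world enumeration and function names here are illustrative, not from the cited works): each sentence's logical probability is the fraction of possible worlds in which it is true, and its information content is the negative base-2 logarithm of that probability.

```python
from itertools import product
from math import log2

# Toy model: possible worlds are truth assignments to two atomic
# propositions A and B, so there are 2**2 = 4 worlds.
worlds = list(product([False, True], repeat=2))

def logical_probability(sentence):
    """Fraction of possible worlds in which the sentence holds."""
    return sum(1 for a, b in worlds if sentence(a, b)) / len(worlds)

def semantic_information(sentence):
    """Carnap/Bar-Hillel-style measure: -log2 of the logical probability."""
    return -log2(logical_probability(sentence))

inf_a     = semantic_information(lambda a, b: a)          # P=1/2  -> 1.0 bit
inf_and   = semantic_information(lambda a, b: a and b)    # P=1/4  -> 2.0 bits
inf_or    = semantic_information(lambda a, b: a or b)     # P=3/4  -> ~0.415 bits
inf_taut  = semantic_information(lambda a, b: a or not a) # P=1    -> 0.0 bits

# "A and B" > "A" > "A or B" > tautology, as SIT asserts.
assert inf_and > inf_a > inf_or > inf_taut == 0.0
```

The key inversion relative to Shannon's statistical measure is that probabilities are assigned over logical possibilities rather than observed frequencies, so a less likely (more world-excluding) sentence carries more semantic information.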
“… Although, e.g., Weaver (1949) suspected that the aspect of meaning could in principle be included, and various attempts have been made to do so (Bao et al., 2011).…”
confidence: 99%