2008
DOI: 10.48550/arxiv.0801.4790
Preprint

Information Width

Joel Ratsaby

Abstract: Kolmogorov argued that the concept of information exists also in problems with no underlying stochastic model (unlike Shannon's representation of information), for instance the information contained in an algorithm or in the genome. He introduced a combinatorial notion of entropy and of the information I(x : y) conveyed by a binary string x about the unknown value of a variable y. The current paper poses the following questions: what is the relationship between the information conveyed by x about y and the description complexity…

Cited by 2 publications (4 citation statements) · References 19 publications
“…The description complexity of an object is the minimal length of a string that describes the object. The distance of [7] is based on the idea that two sets should be considered similar if, given knowledge of one, the additional complexity needed to describe an element of the other set is small (this is also referred to as the conditional combinatorial entropy; see [5,6] and references therein). The advantage of this formulation of distance is its universality: it can be applied without any prior knowledge or assumption about the domain of interest, i.e., the elements that the sets contain.…”
Section: Information Based Distances
confidence: 99%
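The compression-based reading of this idea can be made concrete. Below is a minimal Python sketch that uses zlib compression length as a crude, computable proxy for description complexity and measures how much extra description an element of one set needs once the other set is known; the function names, the use of zlib, and the worst-case aggregation are illustrative assumptions, not the construction of [7].

```python
import zlib

def c(s: bytes) -> int:
    """Compressed length: a computable stand-in for description complexity."""
    return len(zlib.compress(s))

def conditional_cost(x: bytes, known: bytes) -> int:
    """Approximate extra bits to describe x once `known` is available:
    C(known + x) - C(known), a standard compression-based surrogate
    for conditional description complexity."""
    return c(known + x) - c(known)

def set_distance(a: list[bytes], b: list[bytes]) -> float:
    """Two sets are 'close' if, knowing one, elements of the other are
    cheap to describe. Here we take the worst-case extra cost in both
    directions (an illustrative choice, not the definition in [7])."""
    ctx_a, ctx_b = b"".join(sorted(a)), b"".join(sorted(b))
    ab = max(conditional_cost(x, ctx_b) for x in a)
    ba = max(conditional_cost(y, ctx_a) for y in b)
    return max(ab, ba)

if __name__ == "__main__":
    # Tiny strings are noisy for zlib (fixed header overhead), but the
    # direction of the comparison is what matters here.
    a = [b"abcabcabc", b"abcabc"]
    b_set = [b"abcabcabcabc", b"abc"]
    c_set = [b"xyz123!?", b"qwerty"]
    print(set_distance(a, b_set))  # smaller: the sets share structure
    print(set_distance(a, c_set))  # larger: little shared structure
```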
“…The advantage of this formulation of distance is its universality: it can be applied without any prior knowledge or assumption about the domain of interest, i.e., the elements that the sets contain. Such a distance can be viewed as an information-based distance, since the conditional description complexity is essentially the amount of information needed to describe an element of one set given that we know the other set (for more on the notion of combinatorial information and entropy see [5,6]).…”
Section: Information Based Distances
confidence: 99%
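For reference, the combinatorial notions of entropy and information invoked here (see [5,6]) admit a compact statement; the following display is a standard rendering reconstructed from the surrounding discussion, not a quotation from the paper.

```latex
% Combinatorial (non-probabilistic) entropy of a finite set Y:
H(Y) = \log_2 |Y|
% Conditional entropy once x restricts y to a subset Y_x \subseteq Y:
H(Y \mid x) = \log_2 |Y_x|
% Information conveyed by x about y: the drop in entropy,
I(x : Y) = H(Y) - H(Y \mid x) = \log_2 \frac{|Y|}{|Y_x|}
```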
“…In [15] an alternative view of I(x : Y) is defined as the information that a set Y_x conveys about another set Y satisfying Y_x ⊆ Y. Here the domain R is defined based on the previous set…”
Section: Entropy and Information of a Set
confidence: 99%
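As a worked instance of the identity above (a hypothetical toy example, not data from the paper): if knowing x narrows a 16-element domain Y down to a 4-element subset Y_x, then x conveys log2(16/4) = 2 bits about y.

```python
from math import log2

Y = set(range(16))            # full domain of the unknown y
Y_x = {0, 1, 2, 3}            # values still possible once x is known
assert Y_x <= Y               # Y_x must be a subset of Y

H_Y = log2(len(Y))            # combinatorial entropy: 4.0 bits
H_Y_given_x = log2(len(Y_x))  # conditional entropy: 2.0 bits
I = H_Y - H_Y_given_x         # information conveyed by x: 2.0 bits
print(H_Y, H_Y_given_x, I)
```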
“…Thus the second term in (5) can be viewed as the conditional combinatorial entropy of Π_Y(A) given the set Y_x. In [12,13,15] this is used to extend Kolmogorov's combinatorial information to a more general setting where knowledge of x still leaves some vagueness about the possible value of y.…”
Section: Entropy and Information of a Set
confidence: 99%
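The projection Π_Y(A) is not defined in this excerpt; assuming the standard VC-theoretic reading Π_Y(A) = {a ∩ Y : a ∈ A} (the traces of a class A of sets on Y), its combinatorial entropy can be computed directly for small examples, as in this illustrative sketch with made-up data.

```python
from math import log2

def projection(A: list[frozenset], Y: frozenset) -> set[frozenset]:
    """Trace of the class A on Y: Pi_Y(A) = {a & Y for a in A}."""
    return {a & Y for a in A}

def comb_entropy(S) -> float:
    """Combinatorial entropy of a finite set: log2 of its size."""
    return log2(len(S))

# Hypothetical class of sets over the domain {0,...,5}.
A = [frozenset({0, 1}), frozenset({1, 2}), frozenset({0, 1, 2}), frozenset({3})]
Y = frozenset({0, 1, 2})

pi = projection(A, Y)              # {{0,1}, {1,2}, {0,1,2}, {}}
print(pi, comb_entropy(pi))        # 4 distinct traces -> 2.0 bits
```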