2011
DOI: 10.3390/e13030683
On a Connection between Information and Group Lattices

Abstract: In this paper we review a particular connection between information theory and group theory. We formalize the notions of information elements and information lattices, first proposed by Shannon. Exploiting this formalization, we expose a comprehensive parallelism between information lattices and subgroup lattices. Qualitatively, isomorphisms between information lattices and subgroup lattices are demonstrated. Quantitatively, a decisive approximation relation between the entropy structures of information lattic…

Cited by 16 publications (22 citation statements); References 39 publications (54 reference statements).
“…In optical and condensed matter physics, researchers have used the Shannon entropy variation to assess the stability of a single-photon-based quantum cryptography protocol [216], to determine the probability of electronic charge distribution between the atoms of a benzene ring (also named an aromaticity measure) [217], and to evaluate the electron localization in a molecular system [218]. Moreover, a parallelism in condensed matter physics between the information lattice and the subgroup lattice has been reported [219]. In two-dimensional photonic structures, it has been demonstrated that the normalized total transmission Tnt can be related to the Shannon index H approximately through a simple linear behaviour (17), which shows an increase of the normalized total transmission with increasing Shannon index [220,221].…”
Section: Figure 11, transmission spectrum of a 1D disordered photoni… (mentioning)
confidence: 99%
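The Shannon index referenced in the quote above is the standard Shannon entropy of a discrete probability distribution. As a minimal sketch (the function name `shannon_index` and the example distribution are illustrative, not from the cited works):

```python
import math

def shannon_index(probs):
    """Shannon index H = -sum(p * ln p), skipping zero-probability terms."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# A uniform distribution over 4 outcomes maximizes H at ln 4 ≈ 1.386.
print(shannon_index([0.25, 0.25, 0.25, 0.25]))
```

The linear relation between Tnt and H reported in [220,221] (their equation 17) is not reproduced here, since its coefficients are not given in the quoted passage.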
“…Given random variables X and Y, we write X ⪯ Y to signify that there exists a measurable function, f, such that X = f(Y) almost surely (i.e., with probability one). In this case, following the terminology in [8], we say that X is informationally poorer than Y; this induces a partial order on the set of random variables. Similarly, we write X ⪰ Y if Y ⪯ X, in which case we say X is informationally richer than Y.…”
Section: Informational Partial Order and Equivalence (mentioning)
confidence: 99%
“…If X and Y are such that X ⪯ Y and X ⪰ Y, then we write X ≅ Y. In this case, again following [8], we say that X and Y are informationally equivalent. In other words, X ≅ Y if and only if one can relabel the values of X to obtain a random variable that is equal to Y almost surely, and vice versa.…”
Section: Informational Partial Order and Equivalence (mentioning)
confidence: 99%
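The partial order described above can be checked empirically on paired samples: X ⪯ Y holds when X is a deterministic function of Y, i.e., each observed value of Y maps to exactly one value of X. A minimal sketch (the function name `is_informationally_poorer` is illustrative, not from [8]):

```python
def is_informationally_poorer(xs, ys):
    """Check X ⪯ Y on paired samples: every observed Y-value must map
    to exactly one X-value, so that X = f(Y) for some function f."""
    mapping = {}
    for x, y in zip(xs, ys):
        if y in mapping and mapping[y] != x:
            return False  # same Y-value produced two different X-values
        mapping[y] = x
    return True

# X = Y mod 2 is a function of Y, so X ⪯ Y; Y is not a function of X.
ys = [0, 1, 2, 3, 2, 1]
xs = [y % 2 for y in ys]
print(is_informationally_poorer(xs, ys))  # True
print(is_informationally_poorer(ys, xs))  # False
```

Informational equivalence X ≅ Y then corresponds to the check passing in both directions, which happens exactly when the two variables are relabelings of one another.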
“…Given an underlying probability space, there exists a natural one-to-one mapping between sample-space partitions and σ-algebras. This implies that we can partition the set of all RVs into disjoint equivalence classes, called information elements, such that all RVs within a given class are informationally equivalent [17], [18]. Shannon then defined a relation of inclusion between two such information elements: we say that Y (X) is informationally richer…”
Section: Information Structures and the Lattice of Information σ-Alge… (mentioning)
confidence: 99%
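The correspondence between RVs and sample-space partitions mentioned above can be made concrete for finite sample spaces: an RV induces a partition of the sample points by its level sets, and the richer RV is the one whose partition refines the other's. A minimal sketch under that assumption (the helper names `partition_of` and `refines` are illustrative):

```python
def partition_of(values):
    """Partition of sample-point indices induced by an RV's observed values:
    two points fall in the same block iff the RV takes the same value on them."""
    blocks = {}
    for i, v in enumerate(values):
        blocks.setdefault(v, set()).add(i)
    return {frozenset(b) for b in blocks.values()}

def refines(p, q):
    """True if partition p refines q: every block of p lies inside a block of q.
    The refining RV is the informationally richer one."""
    return all(any(b <= c for c in q) for b in p)

# Y = identity distinguishes all 4 sample points; X = floor(Y/2) merges pairs.
p_y = partition_of([0, 1, 2, 3])
p_x = partition_of([0, 0, 1, 1])
print(refines(p_y, p_x))  # True: Y's partition refines X's, so Y is richer
print(refines(p_x, p_y))  # False
```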