2010
DOI: 10.3390/e12071765

Entropy and Information Approaches to Genetic Diversity and its Expression: Genomic Geography

Abstract: This article highlights advantages of entropy-based genetic diversity measures, at levels from gene expression to landscapes. Shannon's entropy-based diversity is the standard for ecological communities. The exponentials of Shannon's and the related "mutual information" excel in their ability to express diversity intuitively, and provide a generalised method of considering microscopic behaviour to make macroscopic predictions, under given conditions. The hierarchical nature of entropy and information allows in…
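The "exponential of Shannon's entropy" mentioned in the abstract is the effective number of equally frequent types (alleles, species) that would produce the same entropy. Below is a minimal sketch of that calculation; the function names and example frequencies are illustrative, not taken from the paper.

```python
import numpy as np

def shannon_entropy(freqs):
    """Shannon entropy H = -sum(p * ln p) over non-zero frequencies, in nats."""
    p = np.asarray(freqs, dtype=float)
    p = p[p > 0] / p.sum()          # normalise and drop empty classes
    return -np.sum(p * np.log(p))

def effective_number(freqs):
    """Exponential of Shannon entropy: the number of equally frequent
    types that would give the same entropy (a Hill number of order 1)."""
    return np.exp(shannon_entropy(freqs))

# Illustrative example: four alleles, one dominant -> effective number well below 4
print(effective_number([0.7, 0.1, 0.1, 0.1]))   # ~ 2.56
```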

Cited by 92 publications (100 citation statements) | References: 132 publications (206 reference statements)

“…Shannon’s information index is a diversity measure that takes into consideration the frequency of each allele in addition to the total number of alleles [43], [44]. We also assigned individuals to putative populations based on the expected frequencies of their genotypes in those populations using a “leave one out” option in GenAlEx 6.5 [45], [46].…”
Section: Methods
confidence: 99%
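For context, the Shannon information index cited above is, per locus, the entropy of the observed allele frequencies. The sketch below computes it from a list of gene copies; it is a hand-rolled illustration of the standard formula, not a reproduction of GenAlEx's implementation, and the example locus is hypothetical.

```python
from collections import Counter
import math

def shannon_index(alleles):
    """Shannon's information index I = -sum(p_i * ln p_i) for one locus,
    from a list of observed alleles (one entry per gene copy)."""
    counts = Counter(alleles)
    n = sum(counts.values())
    return -sum((c / n) * math.log(c / n) for c in counts.values())

# Hypothetical locus: 10 gene copies carrying three alleles
locus = ["A"] * 6 + ["B"] * 3 + ["C"]
print(round(shannon_index(locus), 3))   # 0.898
```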
“…The seminal work of Shannon on Information Theory [1] gave rise to the concept of Mutual Information (MI) [2] as a measure of probabilistic dependence among random variables (RVs), with a broad range of applications, including neuroscience [3], communications and engineering [4], physics, statistics, economics [5], genetics [6], linguistics [7] and geosciences [8]. MI is the positive difference between two Shannon entropies of the RVs: the entropy assuming statistical independence, $H(X)+H(Y)$, and the joint entropy $H(X,Y)$; a minimum MI is obtained by subtracting from the sum of the fixed marginal entropies the maximum joint entropy (ME) $H_{\max}$ compatible with imposed cross constraints.…”
Section: The State Of The Art
confidence: 99%
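Reading the statement above as the standard definition $MI = H(X) + H(Y) - H(X,Y)$, a minimal sketch of the computation for a discrete joint distribution is given below; the 2x2 table is a made-up example, not data from either paper.

```python
import numpy as np

def entropy(p):
    """Shannon entropy of a probability array (zeros ignored), in nats."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def mutual_information(joint):
    """MI as the entropy assuming independence, H(X) + H(Y),
    minus the joint entropy H(X, Y)."""
    joint = np.asarray(joint, dtype=float)
    joint = joint / joint.sum()
    return entropy(joint.sum(axis=1)) + entropy(joint.sum(axis=0)) - entropy(joint)

# Made-up 2x2 joint distribution with positive dependence
print(mutual_information([[0.4, 0.1],
                          [0.1, 0.4]]))   # ~ 0.193 nats
```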
“…The number of independent and cross moments of $T_j$ (13) is $2j$ and $j(j-1)/2$ respectively (e.g. (4, 1), (8, 6), (12, 15) and (16, 28), for j = 2, 4, 6, 8). Other, more efficient basis cross functions could be used, for example orthogonal polynomials.…”
Section: Gaussian and Non-gaussian MI
confidence: 99%
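A quick check of the reconstructed formulas ($2j$ independent and $j(j-1)/2$ cross moments) against the pairs quoted in the statement:

```python
# Check the quoted (independent, cross) moment counts for j = 2, 4, 6, 8.
for j in (2, 4, 6, 8):
    print((2 * j, j * (j - 1) // 2))
# -> (4, 1), (8, 6), (12, 15), (16, 28)
```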
“…This effect would be compensated for, in part, through adaptation of the species, quantified as genetic information. Therefore, a reduction of biomass in a sub-system is sometimes compensated for by an increment of information present in the genes (Sherwin, 2010). The presence of soluble salts in soils generates adverse conditions for unadapted species, reducing the total amount of biomass in the sub-system and, therefore, its exergy.…”
Section: Succession Dynamics
confidence: 99%