2022
DOI: 10.1073/pnas.2123152119

Accumulation and maintenance of information in evolution

Abstract: Selection accumulates information in the genome—it guides stochastically evolving populations toward states (genotype frequencies) that would be unlikely under neutrality. This can be quantified as the Kullback–Leibler (KL) divergence between the actual distribution of genotype frequencies and the corresponding neutral distribution. First, we show that this population-level information sets an upper bound on the information at the level of genotype and phenotype, limiting how precisely they can be specified by…
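The abstract's population-level measure is a KL divergence between the distribution of genotype frequencies shaped by selection and its neutral counterpart. A minimal numerical sketch in Python (the three-genotype distributions below are invented for illustration and stand in for the paper's full distributions over genotype frequencies):

```python
import numpy as np

def kl_bits(p, q):
    """Kullback-Leibler divergence D(p || q) in bits."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    m = p > 0
    return float(np.sum(p[m] * np.log2(p[m] / q[m])))

# Toy illustration with three genotypes. Under neutrality the population is
# assumed equally likely to be found near any composition; selection
# concentrates probability on compositions dominated by the fit genotype.
neutral  = [1/3, 1/3, 1/3]     # neutral distribution (assumed)
selected = [0.90, 0.05, 0.05]  # distribution under selection (assumed)

print(f"information stored by selection: {kl_bits(selected, neutral):.3f} bits")
```

The divergence (about 1.02 bits here) grows as selection pins the population to ever less neutrally likely states, which is the sense in which selection "accumulates information".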

Cited by 15 publications (9 citation statements)
References 56 publications
“…Kimura calculates the amount of information acquired per sweep in terms of p_0 and then relates this to the cost of selection using Haldane's equation D = -ln(p_0) (Kimura 1961). More recent approaches treat the bounds placed on the accumulation of information in much more detail, while treating either the Malthusian parameter (McGee et al. 2022) or classic discrete-time relative fitness (Hledík, Barton, and Tkačik 2022). Both of these approaches define an information “cost”, but this is not equal to our cost in terms of selective deaths.…”
Section: Discussion (mentioning)
confidence: 99%
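A short worked check of the quoted relation: Kimura's information gained per sweep is -log2(p_0) bits when the favored allele starts at frequency p_0, while Haldane's cost D = -ln(p_0) counts selective deaths, so the two differ only by the factor ln 2. A sketch (the value of p_0 is assumed for illustration):

```python
import math

p0 = 1e-4  # assumed initial frequency of the favored allele

# Kimura (1961): information acquired by one selective sweep, in bits.
info_bits = -math.log2(p0)

# Haldane's cost of selection for the same sweep, D = -ln(p0),
# in units of selective deaths per capita.
cost_D = -math.log(p0)

print(f"information per sweep: {info_bits:.2f} bits")
print(f"Haldane cost D:        {cost_D:.2f}")
print(f"D / bits = ln 2 =      {cost_D / info_bits:.4f}")
```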
“…Our proposal to use the large-deviation function (LDF) for currents as a cost measure to tie physiological events to selective events is driven by two considerations: First, relative entropies [122–124] and a variety of functions either derived from [125, 126] or related to [127] them are understood as measures of both information and cost [31, 118, 120, 121, 124] through selection in population processes. Second, the events within physiology form a nested partition function with the population-level events of reproduction or death, through which selection acts in the form of differential rates.…”
Section: The Information Associated With Large Deviations At Single I... (mentioning)
confidence: 99%
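The quoted passage leans on the large-deviation view in which a relative entropy is the exponential "cost" of an atypical outcome. A self-contained sketch of the simplest instance, Sanov's theorem for i.i.d. events (the distributions mu and nu are invented, and this is far simpler than the current-level LDFs the authors invoke): the probability that n events produce empirical frequencies near nu, when the true rates are mu, decays like exp(-n·D(nu‖mu)).

```python
import math

def kl(p, q):
    """Relative entropy D(p || q) in nats."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def log_multinomial_pmf(counts, probs):
    """Log-probability of an exact count vector under a multinomial."""
    n = sum(counts)
    out = math.lgamma(n + 1)
    for c, p in zip(counts, probs):
        out += c * math.log(p) - math.lgamma(c + 1)
    return out

mu = [0.5, 0.3, 0.2]  # true per-event rates (assumed)
nu = [0.2, 0.3, 0.5]  # atypical empirical frequencies (assumed)

print(f"D(nu || mu) = {kl(nu, mu):.4f} nats")
for n in (30, 300, 3000, 30000):
    counts = [round(n * f) for f in nu]      # type class closest to nu
    rate = -log_multinomial_pmf(counts, mu) / n
    print(f"n = {n:>6}: -(1/n) log P(type) = {rate:.4f}")
```

As n grows, the empirical decay rate converges to D(nu‖mu), illustrating why relative entropies can be read simultaneously as information measures and as costs of selection-like deviations.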
“…The term j(i) is the probability that a sub-word is in an alternative DNA conformation, where 0 ≤ j(i) ≤ 1, and varies by context. Another possible approach to capturing the coding capacity of a genome is based on calculating the Kullback–Leibler divergence from a reference genome [36–40]. There is, however, no adjustment for the flipon-mediated effects on coding due to changes in isoform usage, transcript editing, and RNA modification.…”
Section: Directed Cycles and Computation (mentioning)
confidence: 99%
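One simple way to instantiate the "KL divergence from a reference genome" idea mentioned above is to compare k-mer frequency distributions; the cited approaches differ in detail, and the sequences below are toys standing in for a genome and its reference:

```python
from collections import Counter
import math

def kmer_dist(seq, k=3):
    """Normalized k-mer frequency distribution of a DNA sequence."""
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    total = sum(counts.values())
    return {kmer: n / total for kmer, n in counts.items()}

def kl_from_reference(p, q, floor=1e-6):
    """D(p || q) in bits; a small floor guards against k-mers absent from q."""
    return sum(pi * math.log2(pi / q.get(kmer, floor))
               for kmer, pi in p.items())

# Toy sequences (assumed) standing in for a genome and a reference genome.
genome    = "ACGTACGGACGTTGCACGTACGAACGGTTGC"
reference = "ACGTACGTACGTACGTACGTACGTACGTACG"

d = kl_from_reference(kmer_dist(genome), kmer_dist(reference))
print(f"KL divergence from reference (3-mers): {d:.3f} bits")
```

As the quoted statement notes, a divergence computed this way sees only sequence composition; it carries no adjustment for flipon-mediated effects such as changes in isoform usage, transcript editing, or RNA modification.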