2017
DOI: 10.1088/1742-5468/aa71d4

What we learn from the learning rate

Abstract: The learning rate is an information-theoretical quantity for bipartite Markov chains describing two coupled subsystems. It is defined as the rate at which transitions in the downstream subsystem tend to increase the mutual information between the two subsystems, and is bounded by the dissipation arising from these transitions. Its physical interpretation, however, is unclear, although it has been used as a metric for the sensing performance of the downstream subsystem. In this paper, we explore the behaviour o…
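The definition in the abstract translates directly into a computable quantity. The sketch below is illustrative only and is not the paper's code: it assumes a minimal bipartite model in which an environment X flips at rate gamma and a sensor Y jumps toward the current value of X at rate w_plus and away from it at rate w_minus (all parameter values invented for illustration), then evaluates the steady-state learning rate l_Y = Σ p(x,y) w(y→y'; x) ln[p(x|y')/p(x|y)] over Y-transitions.

```python
import numpy as np

# Illustrative bipartite model (assumed, not from the paper):
# environment X in {0,1} flips at rate gamma; sensor Y in {0,1}
# jumps toward the current X at rate w_plus and away at rate w_minus.
gamma, w_plus, w_minus = 1.0, 5.0, 0.5

states = [(x, y) for x in (0, 1) for y in (0, 1)]
idx = {s: i for i, s in enumerate(states)}

# Generator L with columns as source states (dp/dt = L p); bipartite:
# every allowed jump changes x or y, never both.
L = np.zeros((4, 4))
for x, y in states:
    j = idx[(x, y)]
    L[idx[(1 - x, y)], j] += gamma                                # X jump
    L[idx[(x, 1 - y)], j] += w_plus if (1 - y) == x else w_minus  # Y jump
L -= np.diag(L.sum(axis=0))  # diagonal = minus total exit rate

# Stationary distribution: eigenvector of L with eigenvalue 0.
eigvals, eigvecs = np.linalg.eig(L)
p = np.real(eigvecs[:, np.argmin(np.abs(eigvals))])
p /= p.sum()

p_y = {y: p[idx[(0, y)]] + p[idx[(1, y)]] for y in (0, 1)}
cond = lambda x, y: p[idx[(x, y)]] / p_y[y]  # p(x | y)

# Learning rate: sum over Y jumps of (probability flux) times the
# log-ratio of conditionals before and after the jump.
l_Y = sum(
    p[idx[(x, y)]]
    * (w_plus if (1 - y) == x else w_minus)
    * np.log(cond(x, 1 - y) / cond(x, y))
    for x, y in states
)
print(f"steady-state learning rate l_Y = {l_Y:.4f} nats / unit time")
```

With the rates above the sensor tracks the environment, so l_Y comes out positive: on average, Y-jumps increase the mutual information between the two subsystems, exactly as the abstract describes.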

Cited by 24 publications (26 citation statements) | References 46 publications

“…is a learning rate which quantifies the information gained by the system about the current environmental state [17]. The denominator follows from the facts that at steady state −Q = W (due to energy conservation) and W = W_diss [7].…”
Section: Results (mentioning)
confidence: 99%
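Reassembled as displayed math, the ratio this excerpt describes appears to be an informational efficiency of roughly the following form (a reconstruction under the quoted steady-state identities; the symbol η and the rate notation are assumed, not quoted):

```latex
\[
\eta \;=\; \frac{\ell}{W_{\mathrm{diss}}},
\qquad
W_{\mathrm{diss}} \;=\; W \;=\; -Q \quad \text{at steady state},
\]
```

where ℓ is the learning rate and the denominator quantities are understood as rates at steady state. The abstract's bound, that ℓ is limited by the dissipation arising from the downstream transitions, then gives η ≤ 1.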
“…[287,288] At steady state this nostalgia (for unit-time steps) equals the learning rate, the rate at which a system (due to its own dynamics) increases its mutual information with the environment [289-293]. Intuitively, this says that thermodynamic efficiency is achieved by forgetting (i.e., rapidly randomizing and hence relaxing to equilibrium with respect to) degrees of freedom in one's environment that are not predictive of future fluctuations. A possible implication for the efficient transduction of environmental fluctuations by molecular machines is that they must ignore the myriad aspects of their environment (e.g., sundry collisions with water molecules) that have no bearing on mechanically meaningful future environmental fluctuations.…”
Section: Information Machines (mentioning)
confidence: 99%
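For reference, the "nostalgia" of Still et al. mentioned here is commonly written as the gap between memory (the instantaneous mutual information) and prediction; the notation below, with s the system state and x the environment, is assumed rather than quoted:

```latex
\[
I_{\mathrm{nostalgia}}(t) \;=\; I[s_t;\,x_t] \;-\; I[s_t;\,x_{t+1}],
\]
```

i.e., information the system retains about the current environment that does not predict its next state. Per the excerpt, at steady state and for unit time steps this gap equals the learning rate.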
“…To characterize the directional information flow from one system to the other, we now consider the two informational quantities: the learning rate [11,16,17,19,26] and the transfer entropy [5-7, 10, 16, 17]. The learning rate has recently been introduced in Ref.…”
Section: Setup (mentioning)
confidence: 99%
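In one common continuous-time convention (notation assumed here, not quoted from the excerpt), the two quantities compare a single-time correlation gain against a path-conditioned information flow:

```latex
\[
\ell_y \;=\; \left.\frac{\partial}{\partial\tau}\, I[x_t;\,y_\tau]\right|_{\tau=t},
\qquad
\dot T_{X\to Y} \;=\; \lim_{\mathrm{d}t\to 0}\,\frac{1}{\mathrm{d}t}\,
I\!\left[\,y_{t+\mathrm{d}t};\,x_{[0,t]}\,\middle|\,y_{[0,t]}\,\right].
\]
```

The learning rate never exceeds the transfer entropy rate, since conditioning on the full trajectory can only expose more information flow; this is what makes their ratio (defined in the next excerpt) at most one.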
“…In particular, the thermodynamics of autonomous measurement and feedback has been developed [4-17] and applied to biological systems [16-24], where the concept of continuous information flow has played a significant role. Specifically, the transfer entropy [4-7, 10, 16, 17, 25] and the learning rate [4,11,16,17,19,26] have been shown to be related to the second law of thermodynamics. The ratio of these two informational quantities is referred to as the sensory capacity [16,17], which has been argued to be a measure of the effectiveness of stochastic sensors.…”
Section: Introduction (mentioning)
confidence: 99%
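Written out in the notation of the previous sketch, the sensory capacity referred to here is simply

```latex
\[
C \;=\; \frac{\ell_y}{\dot T_{X\to Y}} \;\le\; 1,
\]
```

with C = 1 characterizing a sensor that converts all of the information flowing in from the environment into instantaneous knowledge of the environmental state.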