2019
DOI: 10.3390/e21090881
The Poincaré-Shannon Machine: Statistical Physics and Machine Learning Aspects of Information Cohomology

Abstract: This paper presents the computational methods of information cohomology applied to genetic expression in [125,16,17] and in the companion paper [16] and proposes its interpretations in terms of statistical physics and machine learning. In order to further underline the Hochschild cohomological nature of information functions and chain rules, following [13,14,133], the computation of the cohomology in low degrees is detailed to show more directly that the k multivariate mutual informations (I_k) are k-cobounda…

Cited by 14 publications (19 citation statements).
References 166 publications (378 reference statements).
“…The main theorems, definitions and data analysis [147,134,133] establish the following results, here included with comments about their relevance regarding consciousness and neural processing theories.…”
Section: Information Topology Synthesis: Consciousness's Complexes An… (mentioning, confidence: 97%)
“…It appears by direct computation in this cohomology that mutual informations with an odd number of variables are minus the coboundary of even degrees, ∂_{2k} F = −I_{2k+1}. Obtaining even mutual informations is achieved by introducing a second coboundary with either trivial or symmetric action [132,89,133], giving the even mutual informations as minus the odd symmetric coboundaries, ∂*_{2k−1} F = −I_{2k}. The independence of two variables (I_2 = 0) is then directly generalized to k variables and gives the cocycles I_k = 0 [134].…”
Section: Information Topology Synthesis: Consciousness's Complexes An… (mentioning, confidence: 99%)
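The quoted cocycle condition (independence generalizing to I_k = 0) can be checked numerically. Below is a minimal sketch, not code from the paper: it computes the k-variate mutual information I_k as the alternating (inclusion-exclusion) sum of subset entropies, I_k = Σ_{∅≠S⊆{1..k}} (−1)^{|S|+1} H(S), in the Hu/McGill sign convention; the function and variable names are illustrative assumptions.

```python
import itertools
import math

def entropy(p):
    """Shannon entropy in bits of a probability table {outcome: prob}."""
    return -sum(q * math.log2(q) for q in p.values() if q > 0)

def marginal(joint, idx):
    """Marginalize a joint table {outcome_tuple: prob} onto the indices in idx."""
    m = {}
    for outcome, q in joint.items():
        key = tuple(outcome[i] for i in idx)
        m[key] = m.get(key, 0.0) + q
    return m

def mutual_information(joint, k):
    """k-variate mutual information I_k as the alternating sum of
    marginal entropies over all nonempty subsets of the k variables."""
    total = 0.0
    for r in range(1, k + 1):
        for idx in itertools.combinations(range(k), r):
            total += (-1) ** (r + 1) * entropy(marginal(joint, idx))
    return total

# Two independent fair bits: I_2 vanishes (the cocycle condition I_k = 0).
joint_indep = {(x, y): 0.25 for x in (0, 1) for y in (0, 1)}
print(round(mutual_information(joint_indep, 2), 10))  # 0.0

# XOR triple: pairwise independent, yet I_3 = -1 bit (purely synergetic).
joint_xor = {(x, y, x ^ y): 0.25 for x in (0, 1) for y in (0, 1)}
print(round(mutual_information(joint_xor, 3), 10))  # -1.0
```

The XOR example shows why the sign pattern in the quote matters: I_3 can be negative, which is precisely the behavior the odd/even coboundary decomposition accounts for.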
“…The approach by Baez et al. shows that a particular class of functors from the category FinStat, whose objects are finite sets equipped with probability distributions, are scalar multiples of the entropy [15]. The papers by Baudot et al. [16][17][18] also take a category-theoretic approach; however, their results focus on the topological properties of information-theoretic quantities. Both Baez et al. and Baudot et al. discuss various information-theoretic measures such as the relative entropy, mutual information, total correlation, and others.…”
Section: Introduction (mentioning, confidence: 99%)