Abstract: We propose that entropy is a universal cohomological class in a theory associated with a family of observable quantities and a family of probability distributions. Three cases are presented: (1) classical probabilities and random variables; (2) quantum probabilities and observable operators; (3) dynamic probabilities and observation trees. This gives rise to a new kind of topology for information processes, which accounts for the main information functions: entropy, mutual informations at all orders, and the Kullback-Leibler divergence, and generalizes them in several ways. The article is divided into two parts, which can be read independently. In the first part, the introduction, we provide an overview of the results, some open questions, future lines of research, and a brief discussion of the application to complex data. In the second part we give the complete definitions and proofs of Theorems A, C and E of the introduction, which show why entropy is the first homological invariant of a structure of information in four contexts: static classical or quantum probability, and the dynamics of classical or quantum strategies of observation of a finite system.
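The main information functions the theory organizes (entropy, mutual information, Kullback-Leibler divergence) can be computed directly for a small discrete case. A minimal sketch, with an illustrative joint distribution invented for the example (not taken from the paper):

```python
import numpy as np

def H(p):
    """Shannon entropy in bits of a probability vector (zero terms skipped)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# toy joint distribution of two binary random variables X, Y
pxy = np.array([[0.4, 0.1],
                [0.1, 0.4]])
px, py = pxy.sum(axis=1), pxy.sum(axis=0)

# mutual information I(X;Y) = H(X) + H(Y) - H(X,Y)
I = H(px) + H(py) - H(pxy.ravel())

# the same quantity as a Kullback-Leibler divergence D(p_XY || p_X p_Y)
q = np.outer(px, py)
D = float((pxy * np.log2(pxy / q)).sum())
```

The identity I(X;Y) = D(p_XY || p_X p_Y) that the two computations verify is one of the relations the cohomological framework recovers systematically.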
Abstract. We present and study a homographic best approximation problem, which arises in the analysis of waveform relaxation algorithms with optimized transmission conditions. Its solution characterizes, within each class of transmission conditions, the one for which the associated waveform relaxation algorithm performs best. We present the particular class of first-order transmission conditions in detail and show that the new waveform relaxation algorithms are well posed and converge much faster than the classical one: the number of iterations needed to reach a given accuracy can be orders of magnitude smaller. We illustrate our analysis with numerical experiments.
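The classical waveform relaxation iteration that the optimized variants improve on can be sketched on a toy coupled linear system: each subsystem is advanced over the whole time window while the coupling terms are lagged to the previous iterate. The system, splitting, window length and step count below are illustrative choices, not taken from the paper:

```python
import numpy as np

# toy coupled linear system x' = A x on the window [0, T]
A = np.array([[-2.0, 1.0],
              [1.0, -2.0]])
M = np.diag(np.diag(A))   # part kept inside each subsystem
N = A - M                 # coupling terms, lagged to the previous iterate

T, steps = 2.0, 2000
dt = T / steps
x0 = np.array([1.0, 0.0])

# reference: forward Euler on the full (coupled) system
x_ref = np.zeros((steps + 1, 2)); x_ref[0] = x0
for n in range(steps):
    x_ref[n + 1] = x_ref[n] + dt * (A @ x_ref[n])

# classical (Jacobi-type) waveform relaxation over the whole window
x_old = np.tile(x0, (steps + 1, 1))   # initial guess: constant waveform
errs = []
for k in range(10):
    x_new = np.zeros_like(x_old); x_new[0] = x0
    for n in range(steps):
        x_new[n + 1] = x_new[n] + dt * (M @ x_new[n] + N @ x_old[n])
    errs.append(float(np.abs(x_new - x_ref).max()))
    x_old = x_new
```

On a finite window the iteration converges superlinearly (the error list `errs` shrinks rapidly); optimized transmission conditions accelerate the analogous iteration for domain-decomposed PDEs.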
We introduce a mathematical model of embodied consciousness, the Projective Consciousness Model (PCM), which is based on the hypothesis that the spatial field of consciousness (FoC) is structured by a projective geometry and under the control of a process of active inference. The FoC in the PCM combines multisensory evidence with prior beliefs in memory and frames them by selecting points of view and perspectives according to preferences. The choice of projective frames governs how expectations are transformed by consciousness. Violations of expectation are encoded as free energy. Free energy minimization drives perspective taking, and controls the switch between perception, imagination and action. In the PCM, consciousness functions as an algorithm for the maximization of resilience, using projective perspective taking and imagination in order to escape local minima of free energy. The PCM can account for a variety of psychological phenomena: the characteristic spatial phenomenology of subjective experience, the distinctions and integral relationships between perception, imagination and action, the role of affective processes in intentionality, but also perceptual phenomena such as the dynamics of bistable figures and body swap illusions in virtual reality. It relates phenomenology to function, showing the computational advantages of consciousness. It suggests that changes of brain states from unconscious to conscious reflect the action of projective transformations and suggests specific neurophenomenological hypotheses about the brain, guidelines for designing artificial systems, and formal principles for psychology.
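The mechanism of free-energy-driven perspective taking can be illustrated with a toy computation. This is not the PCM itself: the two-state space, the candidate "views" and all numbers below are invented for the example, and the free energy reduces here to a KL divergence between current beliefs and each view's generative distribution:

```python
import numpy as np

def free_energy(q, p):
    """Variational free energy in nats; with normalized p this is KL(q || p)."""
    q = np.asarray(q, dtype=float)
    p = np.asarray(p, dtype=float)
    return float((q * (np.log(q) - np.log(p))).sum())

q = np.array([0.7, 0.3])          # current beliefs over two latent states
views = {                          # hypothetical candidate perspectives
    "view_A": np.array([0.65, 0.35]),
    "view_B": np.array([0.2, 0.8]),
    "view_C": np.array([0.5, 0.5]),
}
F = {name: free_energy(q, p) for name, p in views.items()}
best = min(F, key=F.get)           # perspective minimizing free energy
```

Selecting the frame with lowest free energy is the analogue, in this toy setting, of the projective perspective taking that the PCM uses to escape local minima.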
This paper presents methods that quantify the structure of statistical interactions within a given data set, first used in [58]. It establishes new results on the k-multivariate mutual informations (I_k) inspired by the topological formulation of information introduced in [4,63]. In particular, we show that the vanishing of all I_k for 2 ≤ k ≤ n of n random variables is equivalent to their statistical independence. Pursuing the work of Hu Kuo Ting and Te Sun Han [23,21,22], we show that information functions provide coordinates for binary variables and that they are analytically independent on the probability simplex for any set of finite variables. The maximal positive I_k identifies the variables that co-vary the most in the population, whereas the minimal negative I_k identifies synergistic clusters and the variables that most differentiate and segregate the population. Finite data size effects and estimation biases severely constrain the effective computation of the information topology on data, and we provide simple statistical tests for the undersampling bias and the k-dependences, following [43]. We give an example of the application of these methods to genetic expression and unsupervised cell-type classification. The methods unravel biologically relevant subtypes, with a sample size of 41 genes and with few errors. This establishes generic basic methods to quantify epigenetic information storage and a unified unsupervised epigenetic learning formalism. We propose that higher-order statistical interactions and non-identically distributed variables are constitutive characteristics of biological systems, which should be estimated in order to unravel their significant statistical structure and diversity. The topological information data analysis presented here makes it possible to estimate precisely this higher-order structure characteristic of biological systems.
"When you use the word information, you should rather use the word form." — René Thom
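The I_k can be computed by the alternating inclusion-exclusion sum over subset entropies (the formula going back to Hu Kuo Ting). A minimal sketch; the XOR triple below is the textbook example of a purely synergistic cluster (negative I_3 with vanishing pairwise I_2), not data from the paper:

```python
import itertools
import numpy as np

def subset_entropy(p, keep):
    """Entropy in bits of the marginal of joint array p on the axes in keep."""
    drop = tuple(i for i in range(p.ndim) if i not in keep)
    m = p.sum(axis=drop).ravel()
    m = m[m > 0]
    return float(-(m * np.log2(m)).sum())

def I_k(p, vars_):
    """Multivariate mutual information by inclusion-exclusion of entropies."""
    total = 0.0
    for r in range(1, len(vars_) + 1):
        for S in itertools.combinations(vars_, r):
            total += (-1) ** (r + 1) * subset_entropy(p, S)
    return total

# XOR: Z = X ^ Y with X, Y fair coins -- pairwise independent, jointly dependent
p = np.zeros((2, 2, 2))
for x, y in itertools.product((0, 1), repeat=2):
    p[x, y, x ^ y] = 0.25

print(I_k(p, (0, 1)))     # I_2(X;Y)   = 0 bit (pairwise independent)
print(I_k(p, (0, 1, 2)))  # I_3(X;Y;Z) = -1 bit (purely synergistic)
```

The example also shows why checking only pairwise I_2 is not enough: all three I_2 vanish here, yet the triple is dependent, which is exactly what the vanishing criterion over all 2 ≤ k ≤ n detects.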