The brain excels at processing sensory input, even in rich or chaotic environments. Mounting evidence attributes this to the creation of sophisticated internal models of the environment that draw on statistical structures in the unfolding sensory input. Understanding how and where this modeling takes place is a core question in statistical learning, and whether it extends to random sensory signals remains unknown. Here, we identify conditional relations, expressed through transitional probabilities, as an implicit structure supporting the encoding of a random auditory stream. We evaluate this representation using intracranial electroencephalography recordings by applying information-theoretical principles to high-frequency activity (75-145 Hz). We demonstrate that the brain continuously encodes conditional relations between random stimuli in a hierarchically organized network extending beyond the auditory system, including temporal, frontal, and hippocampal regions. Our results highlight that hierarchically organized brain areas continuously attempt to order incoming information by maintaining a probabilistic representation of the sensory input, even when stimuli are presented at random.
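To make the notion of conditional relations concrete, the minimal Python sketch below estimates first-order transitional probabilities from a discrete stimulus sequence. The alphabet size, sequence, and function name are illustrative assumptions, not taken from the paper's pipeline.

    import numpy as np

    def transition_matrix(seq, n_states):
        """Estimate first-order transitional probabilities P(next | current)
        from a discrete stimulus sequence (illustrative sketch only)."""
        counts = np.zeros((n_states, n_states))
        for cur, nxt in zip(seq[:-1], seq[1:]):
            counts[cur, nxt] += 1
        # Normalize each row into a conditional distribution; guard empty rows.
        row_sums = counts.sum(axis=1, keepdims=True)
        return np.divide(counts, row_sums,
                         out=np.zeros_like(counts), where=row_sums > 0)

    rng = np.random.default_rng(0)
    tones = rng.integers(0, 4, size=10_000)  # random stream over 4 hypothetical tones
    print(transition_matrix(tones, n_states=4).round(2))

For a uniformly random stream, every row of this matrix converges to the uniform distribution, which is exactly the probabilistic representation a statistically learning system would settle on.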
Is that the sacred fountain from which a single draught quenches thirst forever? You have gained no refreshment if it does not spring from your own soul. (Goethe, Faust)

Preface

This thesis is submitted in partial fulfilment of the requirements for the degree of Philosophiae Doctor at the University of Oslo. The research presented was conducted at the University of Oslo under the supervision of Kyrre Glette, Alejandro Blenkmann and Tor Endestad. This work was partly supported by the Research Council of Norway (RCN) through its Centres of Excellence scheme, project number 262762, and RCN project numbers 240389 and 314925.

The thesis is a collection of three papers presented in chronological order of writing. The common theme is the investigation of neurophysiological recordings through information-theoretical principles. The papers are preceded by an introductory chapter that relates them to each other, as well as chapters providing background information and motivation for the work. The papers are followed by a discussion that places them in a broader context.
The brain is the most sophisticated information-processing apparatus known to us, and understanding it offers great possibilities. Information theory is a viable candidate for advancing our understanding of this cortical information-processing machine. With its universal applicability, it enables the modeling of complex systems, is free of requirements about the data structure, and can help infer the underlying brain mechanisms. Algorithmic information theory seems particularly suitable since it can estimate the information contained in individual brain responses. Here, we propose a measure grounded in algorithmic information theory, termed Encoded Information, as a novel approach to analyze neurophysiological recordings. Specifically, it assesses the encoded information that mean responses share with one another by compressing the respective signals. By applying the approach to five cortical activity types originating from intracranial electroencephalography recordings of humans and marmoset monkeys, we demonstrate that the information-based encoding can compete with conventional approaches such as the t-test or mutual information. Information-based encoding is attractive whenever one is interested in detecting where in the brain the neural responses differ across experimental conditions.
Brain activity differs vastly between sleep, cognitive tasks, and action. Information theory is an appropriate framework for quantifying these brain states analytically. Based on neurophysiological recordings, this framework can handle complex data sets, is free of requirements about the data structure, and can help infer the underlying brain mechanisms. Specifically, by utilizing algorithmic information theory, it is possible to estimate the absolute information contained in brain responses. While current approaches that apply this theory to neurophysiological recordings can discriminate between different brain states, they are limited in directly quantifying the degree of similarity or encoded information between brain responses. Here, we propose a method grounded in algorithmic information theory that affords direct statements about the similarity of responses by estimating the encoded information through a compression-based scheme. We validated this method by applying it to both synthetic and real neurophysiological data and compared its efficiency to the mutual information measure. The proposed procedure is especially suited for task paradigms contrasting different event types because it can precisely quantify the similarity of neuronal responses.
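The exact Encoded Information estimator is developed in the papers themselves and is not reproduced here. As a rough stand-in, the sketch below quantifies the shared structure of two signals with a normalized compression distance (Cilibrasi and Vitanyi), applying zlib to byte-quantized signals; all names, parameters, and the synthetic signals are illustrative assumptions, not the thesis' method.

    import zlib
    import numpy as np

    def compression_similarity(x, y, n_bins=64):
        """Compression-based similarity in [0, 1]: a generic stand-in for the
        thesis' Encoded Information measure, via normalized compression distance."""
        def to_bytes(s):
            # Quantize the signal to a byte alphabet so it can be compressed.
            lo, hi = float(s.min()), float(s.max())
            q = np.zeros_like(s) if hi == lo else (s - lo) / (hi - lo)
            return (q * (n_bins - 1)).astype(np.uint8).tobytes()
        c = lambda b: len(zlib.compress(b, 9))
        bx, by = to_bytes(x), to_bytes(y)
        cx, cy, cxy = c(bx), c(by), c(bx + by)
        # Normalized compression distance, turned into a similarity.
        return 1.0 - (cxy - min(cx, cy)) / max(cx, cy)

    rng = np.random.default_rng(1)
    t = np.linspace(0, 1, 2000)
    same_a = np.sin(2 * np.pi * 8 * t) + 0.1 * rng.standard_normal(t.size)
    same_b = np.sin(2 * np.pi * 8 * t) + 0.1 * rng.standard_normal(t.size)
    other = rng.standard_normal(t.size)
    print(compression_similarity(same_a, same_b))  # expected higher: shared 8 Hz structure
    print(compression_similarity(same_a, other))   # expected lower: little shared structure

The intuition matches the validation described above: two responses drawn from the same condition compress better jointly than separately, and the gain relative to their individual compressed sizes serves as an estimate of shared information.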
Information theory is a viable candidate to advance our understanding of how the brain processes information generated in the internal or external environment. With its universal applicability, information theory enables the analysis of complex data sets, is free of requirements about the data structure, and can help infer the underlying brain mechanisms. Information-theoretical metrics such as Entropy or Mutual Information have been highly beneficial for analyzing neurophysiological recordings. However, direct comparisons of these methods with well-established metrics, such as the t-test, are rare. Here, we carry out such a comparison by evaluating the novel Encoded Information method against Mutual Information, Gaussian Copula Mutual Information, Neural Frequency Tagging, and the t-test. We do so by applying each method to event-related potentials and event-related activity in different frequency bands originating from intracranial electroencephalography recordings of humans and marmoset monkeys. Encoded Information is a novel procedure that assesses the similarity of brain responses across experimental conditions by compressing the respective signals. Such an information-based encoding is attractive whenever one is interested in detecting where in the brain condition effects are present.
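Of the compared metrics, Gaussian Copula Mutual Information admits a particularly compact sketch: each variable is rank-transformed to a uniform distribution, mapped through the standard normal inverse CDF, and the mutual information is then computed parametrically from the correlation of the Gaussianized variables. The sketch below shows only this core step for one-dimensional variables; the published estimator (Ince et al., 2017) includes bias corrections omitted here, and the function name is our own.

    import numpy as np
    from scipy.stats import norm, rankdata

    def gcmi_1d(x, y):
        """Minimal Gaussian Copula Mutual Information between two 1-D variables,
        in bits (core idea only; published bias corrections omitted)."""
        # Copula transform: ranks -> uniform (0, 1) -> standard normal.
        gx = norm.ppf(rankdata(x) / (x.size + 1))
        gy = norm.ppf(rankdata(y) / (y.size + 1))
        # Parametric Gaussian MI from the correlation of the transformed data.
        rho = np.corrcoef(gx, gy)[0, 1]
        return -0.5 * np.log2(1.0 - rho ** 2)

    rng = np.random.default_rng(2)
    signal = rng.standard_normal(5000)
    dependent = signal + 0.5 * rng.standard_normal(5000)  # shares information with signal
    unrelated = rng.standard_normal(5000)                 # shares none
    print(gcmi_1d(signal, dependent))  # clearly positive
    print(gcmi_1d(signal, unrelated)) # near zero

The copula step is what makes the estimator robust to the marginal distributions of the recordings: only the rank structure of each signal enters the calculation.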