Cognitive functions arise from the coordinated activity of many neural populations. It is challenging to measure how specific aspects of neural dynamics translate into operations of information processing and, ultimately, into cognitive functions. One obstacle is that simple circuit mechanisms (such as self-sustained or propagating activity and nonlinear summation of inputs) do not directly give rise to high-level functions. Nevertheless, they already implement simple transformations of the information carried by neural activity. Here, we propose that distinct neural circuit functions, such as stimulus representation, working memory, or selective attention, stem from different combinations and types of low-level manipulations of information, or information processing primitives. To test this hypothesis, we combine approaches from information theory with computational simulations of neural circuits involving interacting brain regions that emulate well-defined cognitive functions (canonical ring models and a large-scale connectome-based model). Specifically, we track the dynamics of information emergent from dynamic patterns of neural activity, using suitable quantitative metrics to detect where and when information is actively buffered ("active information storage"), transferred ("information transfer"), or non-linearly merged ("information modification"), as possible modes of low-level processing. We find that neuronal subsets maintaining representations in working memory or performing attention-related gain modulation are signaled by their involvement in operations of information storage or modification, respectively.
Thus, information dynamics metrics, beyond detecting which network units participate in cognitive processing, also promise to specify how and when they do it, i.e., through which type of primitive computation, a capability that may be exploited for the analysis of experimental recordings.

Significance Statement

We can easily name brain functions, and we are well informed about brain structure. However, it is not easy to bridge the gap between the two. Part of the problem is that simple circuit mechanisms do not directly give rise to high-level functions. Yet they already implement simpler forms of information processing, a sort of "neural assembly language". Here we track such primitive processing operations using metrics from information theory, benchmarking them on functional simulations. We show that these metrics can reveal the different flavors of information processing involved in distinct, well-defined functions (e.g., working memory, selective attention). We thus transform descriptions of neuronal dynamics into descriptions of how these dynamics specifically propagate and modify information.
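Two of the metrics named above can be illustrated with a minimal sketch. The following is not the paper's own estimation pipeline, but a standard plug-in construction, assuming binary (discretized) activity time series and a history length of 1: active information storage is taken as I(X_t; X_{t-1}) and transfer entropy as I(X_t; Y_{t-1} | X_{t-1}). All function names and the toy data are illustrative.

```python
import numpy as np

def entropy(*series):
    """Joint plug-in Shannon entropy (bits) of one or more discrete series."""
    joint = np.stack(series, axis=1)                      # shape (T, k)
    _, counts = np.unique(joint, axis=0, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def active_information_storage(x):
    # AIS = I(X_t ; X_{t-1}) = H(X_t) + H(X_{t-1}) - H(X_t, X_{t-1})
    return entropy(x[1:]) + entropy(x[:-1]) - entropy(x[1:], x[:-1])

def transfer_entropy(y, x):
    # TE(Y -> X) = H(X_t | X_{t-1}) - H(X_t | X_{t-1}, Y_{t-1}),
    # expanded into four joint-entropy terms:
    return (entropy(x[1:], x[:-1]) - entropy(x[:-1])
            - entropy(x[1:], x[:-1], y[:-1]) + entropy(x[:-1], y[:-1]))

# Toy example: x copies y with a one-step lag, so information is
# transferred from y to x but not stored within x itself.
rng = np.random.default_rng(0)
y = rng.integers(0, 2, 10_000)
x = np.roll(y, 1)

print(active_information_storage(x))   # near 0 bits
print(transfer_entropy(y, x))          # near 1 bit
```

With longer histories, continuous-valued estimators, and local (time-resolved) variants, the same quantities can be tracked across units and time, which is the kind of analysis the abstract describes; dedicated toolkits exist for this, but the plug-in form above captures the definitions.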