Information is a central notion for the cognitive sciences and neurosciences, but there is no agreement on what it means for a cognitive system to acquire information about its surroundings. In this paper, we compare three influential views on information: the one at play in ecological psychology, which is sometimes called information for action; the notion of information as covariance, as developed by some enactivists; and the idea of information as the minimization of uncertainty, as presented by Shannon. Our main thesis is that information for action can be construed as covariant information, and that learning to perceive covariant information is a matter of minimizing uncertainty through skilled performance. We argue that the agent's cognitive system conveys information for acting in an environment by minimizing uncertainty about how to achieve intended goals in that environment. We conclude by reviewing empirical findings that support our view by showing how direct learning, seen as an instance of ecological rationality at work, is how mere possibilities for action are turned into embodied know-how. Finally, we indicate the affinity between direct learning and sense-making activity.

Information is the bread and butter of cognitive science and neuroscience (CSN). Talk of information processing, control, storage, and retrieval abounds in explanations of how cognitive systems perform specific tasks and enable agents to interact intelligently with their environment. Accordingly, one of the defining tasks of CSN is to describe the mechanisms through which information is conveyed, an enterprise that, if successful, allows us to understand, predict, simulate, and intervene upon the cognitive capacities of real agents.

The groundwork for the way information is understood by CSN today was laid by Shannon's (1948) mathematical account of information, which made possible nothing less than digital communication.
Simply put, Shannon's theory defines information in terms of entropy, which measures the average uncertainty in the selection of an encoded signal. The core idea of what became known as Shannon-information is that the less uncertain the selection of the encoded signal is at its receiver, the more information the signal carries from its sender. Noise, on the other hand, corrupts the signal, thereby increasing entropy and diminishing the information conveyed. In short, information is a matter of the minimization of uncertainty. Thus, CSN requires the

Frontiers in Psychology | www.frontiersin.org 1
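The entropy definition above can be made concrete with a short numerical sketch (ours, not the paper's): entropy quantifies the average uncertainty of a source, and the information a noisy channel conveys per use is the receiver's reduction of that uncertainty. The binary-symmetric-channel figures below are standard textbook values, used here purely for illustration.

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum p*log2(p), in bits: the average
    uncertainty of selecting one signal from the encoded repertoire."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A uniform choice among 4 signals is maximally uncertain (2 bits);
# a biased, more predictable source has lower entropy.
print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0
print(entropy([0.7, 0.1, 0.1, 0.1]))      # ~1.357

# For a binary symmetric channel with crossover (noise) probability e
# and a uniform source, the information conveyed per use is
# I = H(X) - H(X|Y) = 1 - H(e) bits: noise raises the residual
# uncertainty at the receiver and so diminishes the information.
for e in (0.0, 0.1, 0.5):
    residual = entropy([e, 1 - e])  # H(e), uncertainty left by noise
    print(e, 1 - residual)          # 0.0 -> 1 bit; 0.5 -> 0 bits
```

With no noise (e = 0) the full bit gets through; with e = 0.5 the received signal is independent of the sent one and no uncertainty is removed, i.e., no information is conveyed.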