Information

What is information? A short answer could be: anything that changes our minds and states of thinking!

Information is virtual. It can be carried by speech, an image, scratches on a stone, patterns in a photon field, the connections of neurons in our brains, or even by the wavefunction of an electron. Information does not depend on the actual means of transmission, nor does its identity change when the carrier changes.

Information is physical. It needs to be sustained by a physical substrate, and rearranging that substrate to contain the wanted information requires spending energy or work. Without a physical world, information cannot be stored, processed, or transmitted. Our world is permeated with different representations of information. Indeed, information about the physical laws governing our Universe can be found everywhere and in everything.

Could it then be that the truly fundamental elements of this world are tiny bits of information? Could we get it from bit? [1] A number of researchers investigate this possibility, and several of their attempts to recover physical laws from the principles governing information are reported in this special issue on the physics of information. The articles by Gregg Jaeger, [2] by Ariel Caticha, [3] and by John Skilling and Kevin Knuth [4] address the reconstruction of quantum mechanics from information theory. Furthermore, Kevin Knuth and James Walsh [5] show how to recover space-time from particles sending each other signals in an initially geometry-free setting.

A central element of most works on information is the concept of probability. Probabilities can be defined via various routes. Most familiar might be the frequentist school's definition of probabilities as relative frequencies of events (in the limit of infinitely many such events). A more general definition is the Bayesian one, which regards probabilities as quantifiers of statements, expressing how strongly those statements should be expected to be true.
Although this definition looks subjective at first glance, it rests on a solid mathematical theorem by Richard Cox [6]: any extension of binary logic to a continuum of single-valued truthfulness quantifiers that respects a minimal set of requirements must be isomorphic to probability theory. Moreover, this definition turns out to embrace the frequentist one whenever infinitely many comparable events are actually available and can be counted. The majority of authors in this special issue seem to favor the Bayesian perspective, which is probably a result of our own, the guest editors', preference and not necessarily a reflection of the prevailing position among contemporary scientists.

The different views of frequentists and Bayesians on probabilities are addressed in the article by Allen Caldwell. [7] He builds a bridge between these antagonistic camps by explaining frequentist constructions in a Bayesian language. In particular, his argument for why the omnipresent p-value in the statistical and experimental literature might often capture a releva...
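The remark above, that the Bayesian definition embraces the frequentist one when many comparable events can be counted, can be illustrated with a short simulation. This is a minimal sketch, not taken from any of the cited articles; the coin bias, sample size, and uniform prior are arbitrary illustrative choices.

```python
import random

random.seed(0)

# Simulate many flips of a coin with a fixed (but "unknown") bias.
p_true = 0.7
flips = [1 if random.random() < p_true else 0 for _ in range(10000)]
heads = sum(flips)
tails = len(flips) - heads

# Frequentist estimate: the relative frequency of heads.
freq_estimate = heads / len(flips)

# Bayesian estimate: starting from a uniform Beta(1, 1) prior, the
# posterior after the data is Beta(1 + heads, 1 + tails), whose mean is
# (1 + heads) / (2 + heads + tails).
posterior_mean = (1 + heads) / (2 + heads + tails)

print(f"relative frequency: {freq_estimate:.4f}")
print(f"posterior mean:     {posterior_mean:.4f}")
```

As the number of flips grows, the posterior mean and the relative frequency become indistinguishable: the two definitions agree in exactly the regime where the frequentist one is applicable.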