According to Kolmogorov complexity, every finite binary string is compressible to a shortest code (its information content) from which it is effectively recoverable. We investigate the extent to which this holds for infinite binary sequences (streams). We devise a new coding method which uniformly codes every stream X into an algorithmically random stream Y, in such a way that the first n bits of X are recoverable from the first I(X ↾ n) bits of Y, where I is any partial computable information content measure which is defined on all prefixes of X, and where X ↾ n is the initial segment of X of length n. As a consequence, if g is any computable upper bound on the initial segment prefix-free complexity of X, then X is computable from an algorithmically random Y with oracle-use at most g. Alternatively (making no use of such a computable bound g) one can achieve an oracle-use bounded above by K(X ↾ n) + log n. This provides a strong analogue of Shannon's source coding theorem for algorithmic information theory.

A fruitful way to quantify the complexity of a finite object, such as a string σ over a finite alphabet, is to consider the length of the shortest binary program which prints σ. This fundamental idea gives rise to a theory of algorithmic information and compression, which is based on the theory of computation and was pioneered by Kolmogorov [21] and Solomonoff [41]. The Kolmogorov complexity of a binary string σ is the length of the shortest program that outputs σ with respect to a fixed universal Turing machine. The use of prefix-free machines in the definition of Kolmogorov complexity was pioneered by Levin [26] and Chaitin [10], and allowed for the development of a robust theory of incompressibility and algorithmic randomness for streams (i.e. infinite binary sequences).
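The "length of the shortest program" idea can be made concrete with a toy sketch. The decompressor below is a hypothetical stand-in for a universal machine (it is neither universal nor prefix-free): a program starting with '1' prints its payload literally, while a program starting with '0' prints a run of zeros whose count is the payload read in binary. Brute-force search over all programs then yields the complexity of a string relative to this one fixed machine, and shows a highly regular string getting a much shorter code than a "random-looking" one.

```python
from itertools import product

def decode(prog: str):
    """Toy decompressor (an illustrative assumption, not a universal machine):
    '1' + payload  -> print payload literally;
    '0' + bin(n)   -> print a run of n zeros."""
    if len(prog) < 2:
        return None
    if prog[0] == "1":
        return prog[1:]
    return "0" * int(prog[1:], 2)

def K_toy(sigma: str, max_len: int = 20) -> int:
    """Length of the shortest program printing sigma, w.r.t. decode."""
    for n in range(2, max_len + 1):
        for bits in product("01", repeat=n):
            if decode("".join(bits)) == sigma:
                return n
    raise ValueError("no program of length <= max_len found")

print(K_toy("0" * 12))   # '0'+'1100' prints twelve zeros: 5 bits
print(K_toy("101101"))   # only literal coding works here: 7 bits
```

The run of twelve zeros compresses to 5 bits while the irregular 6-bit string needs all 7 bits of the literal code; relative to a genuine universal machine the same comparison holds only up to an additive constant.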
Information content measures, defined by Chaitin [11] after Levin [26], are functions that assign a positive integer value to each binary string, representing the amount of information contained in the string.

Definition 1.1 (Information content measure). A partial function I from strings to N is an information content measure if it is right-c.e. and ∑_{σ : I(σ)↓} 2^{−I(σ)} is finite.

Prefix-free Kolmogorov complexity can be characterized as the minimum (modulo an additive constant) information content measure. If K(σ) denotes the prefix-free Kolmogorov complexity of the string σ, then for c ∈ N we say that σ is c-incompressible if K(σ) ≥ |σ| − c. It is a basic fact concerning Kolmogorov complexity that for some universal constant c:

every string σ has a shortest code σ* which is itself c-incompressible.   (1)

Our goal is to investigate the extent to which the above fact holds in an infinite setting, i.e. for streams instead of strings. In the context of Kolmogorov complexity, algorithmic randomness is defined as incompressibility. So (1) can be read as follows: we can uniformly code each string σ into an algorithmically random string of length K(σ). In order to formalise an infinitary analogue of this statement, we need to make use of oracle-machine ...
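The finiteness condition in Definition 1.1 is a Kraft-style weight bound, and it is easy to probe numerically. The sketch below (my own illustrative example, not from the paper) uses measures that depend only on the string's length, so the sum over all strings of length n collapses to a single term: I(σ) = 2|σ| + 1 satisfies the condition, since length n contributes 2^n · 2^−(2n+1) = 2^−(n+1) and the total converges to 1, whereas I(σ) = |σ| fails, since every length contributes exactly 1.

```python
from fractions import Fraction

def kraft_sum(I, max_len: int) -> Fraction:
    """Partial sum of 2^(-I(sigma)) over all strings of length <= max_len,
    for a measure I that (by assumption here) depends only on |sigma|."""
    total = Fraction(0)
    for n in range(max_len + 1):
        total += Fraction(2**n, 2**I(n))  # 2^n strings share one I-value
    return total

# I(sigma) = 2|sigma| + 1: partial sums approach 1, so the sum is finite.
print(float(kraft_sum(lambda n: 2 * n + 1, 30)))

# I(sigma) = |sigma|: each length contributes 1, so the sum diverges.
print(float(kraft_sum(lambda n: n, 30)))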