Motivated by the fact that universal source coding on countably infinite alphabets is not feasible, this work introduces the notion of "almost lossless source coding". Analogous to the weak variable-length source coding problem studied by Han [3], almost lossless source coding aims to relax the lossless block-wise assumption to allow an average per-letter distortion that vanishes asymptotically as the blocklength goes to infinity. In this setup, we show on one hand that the Shannon entropy characterizes the minimum achievable rate (similarly to the case of discrete sources), while on the other hand that almost lossless universal source coding becomes feasible for the family of finite-entropy stationary memoryless sources with countably infinite alphabets. Furthermore, we study a stronger notion of almost lossless universality that demands uniform convergence of the average per-letter distortion to zero, and we establish a necessary and sufficient condition for the so-called family of "envelope distributions" to achieve it. Remarkably, this condition coincides with the necessary and sufficient condition for the existence of a strongly minimax (lossless) universal source code for the family of envelope distributions. Finally, we show that an almost lossless coding scheme offers a faster rate of convergence for the (minimax) redundancy compared to the well-known information radius developed for the lossless case, at the expense of tolerating a non-zero distortion that vanishes as the blocklength grows. This shows that even when lossless universality is feasible, an almost lossless scheme can offer different regimes on the rates of convergence of the (worst-case) redundancy versus the (worst-case) distortion. The material in this paper was partially published in the 2016 [1] and 2017 [2] IEEE International Symposium on Information Theory (ISIT).
Index Terms—Universal source coding, countably infinite alphabets, weak source coding, envelope distributions, information radius, metric entropy analysis.