In this correspondence, we illustrate, among other things, the use of the stationarity property of the set of capacity-achieving inputs in capacity calculations. In particular, as a case study, we consider a bit-patterned media recording channel model and formulate new lower and upper bounds on its capacity that yield improvements over existing results. Inspired by the observation that the new bounds are tight at low noise levels, we also characterize the capacity of this model as a series expansion in the low-noise regime. The key to these results is the realization of stationarity in the supremizing input set in the capacity formula. While this property is prevalent in capacity formulations in the ergodic-theoretic literature, we show that such a realization is also possible in the Shannon-theoretic framework, where a channel is defined as a sequence of finite-dimensional conditional probabilities, by defining a new class of consistent stationary and ergodic channels.

Index Terms-Channel capacity, stationary inputs, stationary and ergodic channel, bit-symmetry, bit-patterned media recording, lower/upper bounds, series expansion.
I. BACKGROUND

The fundamental limit of information transmission through noisy channels, the channel capacity, has been a holy grail in information theory. The capacity problem of a general point-to-point channel has been well resolved with the information-spectrum framework [1]. Such a general formula for the capacity, however, does not lend itself to computation in general, since it requires one to scrutinize the distribution of the information density in the limit of infinite block length. To overcome this problem, a common approach is to find an alternative expression that, instead of being described by an information-spectrum quantity, contains a mutual information quantity (or entropy quantities). While an expression of this kind is not as general, it may cover a sufficiently large class of channels for many practical purposes.

There are two popular forms of such an expression. One is Dobrushin's information-stable channel capacity [2]:

C = \lim_{n\to\infty} \sup_{P^{(n)}} \frac{1}{n} I(X^n; Y^n),

where the supremum is over all possible sequences of distributions P^(n) : X^n ∼ P^(n). This formula holds for the class of information-stable channels. A similar formula also appears in the context of (decomposable or indecomposable) finite-state channels [3]. The other form swaps the supremum and the limit in the above formula, with the supremum taken over a smaller set of input distributions with special structures. This type of formula appears in the ergodic-theoretic literature of information theory. For example, for d̄-continuous discrete stationary and ergodic (SE) two-sided channels, the capacity was shown to be [4]

C = \sup_{\mu} \lim_{n\to\infty} \frac{1}{n} I_\mu(X^n; Y^n),

where µ is a probability measure that describes the input process. (See Section I-B for the distinction between the two sets being supremized over in the above formulas.) Such capacity formulations that involve supremization over stationary inputs are common in the ergodic-theoretic setting, where a channel only admits infinitely long input sequences,...
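To make the mutual-information-rate quantity appearing in both formulas concrete, the following small sketch computes (1/n) I(X^n; Y^n) for the simplest possible case: a memoryless binary symmetric channel (BSC) with an i.i.d. input, where the rate reduces to the single-letter I(X; Y) and the supremum is attained by the uniform (and, in particular, stationary) input. This toy channel and the function names are illustrative assumptions, not the bit-patterned media model studied in this correspondence.

```python
import math

def h2(p):
    """Binary entropy h2(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_mutual_information(p_flip, p_x1):
    """I(X; Y) in bits for a BSC with crossover probability p_flip
    and Bernoulli input P(X = 1) = p_x1."""
    # Output distribution: P(Y = 1) = P(X = 1)(1 - p) + P(X = 0) p
    p_y1 = p_x1 * (1 - p_flip) + (1 - p_x1) * p_flip
    # I(X; Y) = H(Y) - H(Y|X); for the BSC, H(Y|X) = h2(p_flip)
    return h2(p_y1) - h2(p_flip)

# For a memoryless channel driven by an i.i.d. input, the n-letter rate
# (1/n) I(X^n; Y^n) equals I(X; Y), so the supremizing input here is the
# uniform one and the resulting value is the capacity 1 - h2(p).
p = 0.11
rates = [bsc_mutual_information(p, q / 20) for q in range(21)]
assert abs(max(rates) - (1 - h2(p))) < 1e-9  # maximum at the uniform input
```

For this memoryless example, the lim-sup and sup-lim forms trivially coincide; the distinction between the two supremizing sets only becomes substantive for channels with memory, which is the situation the section goes on to discuss.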