We study the classical problem of characterizing the channel capacity and its achieving distribution in a generic fashion. We derive a simple relation between three parameters: the input-output function, the input cost function, and the noise probability density function, which together dictate the type of the optimal input. In layman's terms, we prove that the support of the optimal input is bounded whenever the cost grows faster than a "cut-off" rate equal to the logarithm of the noise PDF evaluated at the input-output function. Furthermore, we prove a converse statement: whenever the cost grows slower than the "cut-off" rate, the optimal input necessarily has unbounded support. In addition, we show that the discreteness of the optimal input is guaranteed whenever the triplet satisfies certain analyticity properties. We argue that a suitable cost function to impose on the channel input is one that grows similarly to the "cut-off" rate. Our results are valid for any cost function that is super-logarithmic. They summarize a large number of previous channel capacity results and give new ones for a wide range of communication channel models, such as Gaussian mixtures, generalized-Gaussian and heavy-tailed noise models, which we state along with numerical computations.
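As an illustration only (the notation below is ours, not quoted from the abstract): writing f for the input-output function, p_N for the noise PDF, and C for the input cost, the trichotomy above can be sketched as follows, where we adopt the sign convention that the "cut-off" rate is the negative log-density, so that it grows to infinity.

```latex
% Hedged sketch of the boundedness trichotomy; f (input-output map),
% p_N (noise PDF) and C (cost) are assumed notation, and the growth
% comparison is our paraphrase of "faster"/"slower" in the abstract.
\[
  r(x) \;=\; -\log p_N\!\bigl(f(x)\bigr)
  \qquad \text{(the ``cut-off'' rate)}
\]
\[
  C(x) \gg r(x) \text{ as } |x| \to \infty
  \;\Longrightarrow\; \text{optimal input has bounded support,}
\]
\[
  C(x) \ll r(x) \text{ as } |x| \to \infty
  \;\Longrightarrow\; \text{optimal input has unbounded support.}
\]
```

For instance, with Gaussian noise and an identity input-output map, -log p_N(x) grows like x^2/(2σ^2), so the classical average-power constraint grows exactly at the "cut-off" rate, the borderline case singled out above as a suitable cost.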
Many communication channels are reasonably modeled as impaired by additive noise. Recent studies suggest that many of these channels are affected by additive noise that is best explained by alpha-stable statistics. In this work we study such channel models and characterize the capacity-achieving input distribution for these channels under fractional-order moment constraints. We prove that the optimal input is necessarily discrete with a compact support for all such channels. Interestingly, if the second moment is viewed as a measure of power, then even when the channel input is allowed to have an infinite second moment, the optimal one is found to have finite power.
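A hedged numerical sketch (our own illustration, not code from the paper), using SciPy's levy_stable to show why fractional-order moments are the natural constraint here: for stability index alpha < 2 the empirical second moment diverges as the sample size grows, while a fractional moment E|X|^r with r < alpha settles to a finite value.

```python
# Hedged illustration (assumed setup, not the paper's experiments):
# alpha-stable noise with alpha < 2 has an infinite second moment,
# but fractional moments E|X|^r remain finite whenever r < alpha.
import numpy as np
from scipy.stats import levy_stable

alpha = 1.5  # stability index in (0, 2); alpha = 2 recovers the Gaussian
x = levy_stable.rvs(alpha, beta=0.0, size=100_000, random_state=0)

for n in (1_000, 10_000, 100_000):
    second = np.mean(x[:n] ** 2)          # keeps growing: no finite limit
    frac = np.mean(np.abs(x[:n]) ** 0.5)  # r = 0.5 < alpha: stabilizes
    print(f"n={n:>6}   E[X^2] ~ {second:12.1f}   E|X|^0.5 ~ {frac:.3f}")
```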
Evaluating the channel capacity is one of the key problems in information theory. In this work we derive rather mild sufficient conditions under which the capacity is finite and achievable. These conditions are derived for generic, memoryless, and possibly non-linear additive noise channels. The results are based on a novel sufficient condition that guarantees the convergence of differential entropies under pointwise convergence of probability density functions. Perhaps surprisingly, the finiteness of channel capacity holds for the majority of setups, including those where inputs and outputs may have infinite second moments.