The conditional intensity (CI) of a counting process $Y_t$ is based on the minimal knowledge $\mathcal{F}^Y_t$, i.e., on the observation of $Y_t$ alone. Prominently, the mutual information rate of a signal and its Poisson channel output is a difference functional between the CI and the intensity that has full knowledge of the input. While the CI of Markov-modulated Poisson processes evolves according to Snyder's filter, self-exciting processes, e.g., Hawkes processes, specify the CI via the history of $Y_t$. The emergence of the CI as a self-contained stochastic process prompts us to bring its statistical ensemble into focus. We investigate the asymptotic conditional intensity distribution (ACID) and emphasize its rich information content. We focus on the case in which the CI is determined by a sufficient statistic that evolves as a Markov process. We present a simulation-free method to compute the ACID when the dimension of the sufficient statistic is low. The method is made possible by introducing a backward recurrence time parametrization, which has the advantage of aligning all probability inflow in a boundary condition of the master equation. Case studies illustrate the use of the ACID for three primary examples: 1) the Poisson channel with binary Markovian input (as an example of a Markov-modulated Poisson process), 2) the standard Hawkes process with exponential kernel (as an example of a self-exciting counting process), and 3) the Gamma filter (as an example of an approximate filter for a Markov-modulated Poisson process).
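For concreteness, consider example 2 in its standard form (the symbols $\mu$, $\alpha$, $\beta$ below denote the usual baseline rate, jump size, and decay rate; they are generic placeholders, not notation fixed by this abstract). The CI of the exponential-kernel Hawkes process reads
\[
  \lambda_t \,=\, \mu + \alpha \int_0^{t^-} e^{-\beta (t-s)}\,\mathrm{d}Y_s ,
  \qquad
  \mathrm{d}\lambda_t \,=\, -\beta\,(\lambda_t - \mu)\,\mathrm{d}t + \alpha\,\mathrm{d}Y_t ,
\]
so the CI is itself a one-dimensional Markov process and serves as its own sufficient statistic; its stationary law is then exactly the kind of ACID addressed by the method above.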