The purpose of this paper is to estimate the intensity of a random measure N on a set X by a piecewise constant function on a finite partition of X. Given a (possibly large) family M of candidate partitions, we build a piecewise constant estimator (histogram) on each of them and then use the data to select one estimator in the family. Choosing the square of a Hellinger-type distance as our loss function, we show that each estimator built on a given partition satisfies an analogue of the classical squared bias plus variance risk bound. Moreover, the selection procedure leads to a final estimator satisfying an oracle-type inequality, with, as usual, a possible loss corresponding to the complexity of the family M. When this complexity is not too high, the risk of the selected estimator is bounded, up to a universal constant, by the smallest risk bound obtained for the estimators in the family. For suitable choices of the family of partitions, we deduce uniform risk bounds over various classes of intensities. Our approach applies to the estimation of the intensity of an inhomogeneous Poisson process, among other counting processes, and to the estimation of the mean of a random vector with nonnegative components.
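To make the two-step procedure concrete, the following is a minimal numerical sketch, not the paper's actual method: it assumes an inhomogeneous Poisson process on [0, 1] simulated by thinning, takes the family M to be regular dyadic partitions, and selects a partition with a generic penalized Poisson log-likelihood contrast as a stand-in for the paper's Hellinger-based selection rule. The intensity, the penalty constant c, and all function names here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_poisson_process(s, s_max, T=1.0):
    """Simulate an inhomogeneous Poisson process on [0, T] by thinning
    a homogeneous process of rate s_max (assumes s <= s_max)."""
    n = rng.poisson(s_max * T)
    t = rng.uniform(0.0, T, size=n)
    keep = rng.uniform(0.0, s_max, size=n) < s(t)
    return np.sort(t[keep])

def histogram_intensity(points, edges):
    """Piecewise constant (histogram) estimator of the intensity:
    bin count divided by bin length on each cell of the partition."""
    counts, _ = np.histogram(points, bins=edges)
    return counts / np.diff(edges)

def penalized_score(points, edges, c=1.0):
    """Generic penalized contrast: Poisson log-likelihood of the histogram
    estimator minus a penalty proportional to the number of cells.
    (A stand-in for the paper's Hellinger-based criterion.)"""
    counts, _ = np.histogram(points, bins=edges)
    widths = np.diff(edges)
    rate = counts / widths
    with np.errstate(divide="ignore", invalid="ignore"):
        loglik = np.where(counts > 0, counts * np.log(rate), 0.0).sum()
    loglik -= rate @ widths          # minus the integral of the estimator
    return loglik - c * len(widths)  # penalty grows with partition size

# Hypothetical intensity, chosen only for illustration.
s = lambda x: 100.0 * (1.0 + np.sin(2.0 * np.pi * x))
points = sample_poisson_process(s, s_max=200.0)

# Family M of candidate partitions: regular dyadic partitions of [0, 1].
partitions = [np.linspace(0.0, 1.0, 2**j + 1) for j in range(7)]

# Data-driven selection of one estimator in the family.
best = max(partitions, key=lambda e: penalized_score(points, e))
print(f"selected partition has {len(best) - 1} cells")
print("estimated intensity per cell:", histogram_intensity(points, best))
```

Under this kind of penalized selection, finer partitions reduce the bias of the histogram while the penalty accounts for their higher variance, mirroring the squared bias plus variance trade-off described above.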