Entropy is useful in statistical problems as a measure of irreversibility, randomness, mixing, dispersion, and the number of microstates. However, ambiguity remains over the precise mathematical formulation of entropy when generalized beyond the additive definition pioneered by Boltzmann, Gibbs, and Shannon (applicable to thermodynamic equilibria). For generalized entropies to be applied rigorously to nonequilibrium statistical mechanics, we argue that a physically interpretable (dimensional) framework is needed, one that can be connected to dynamical processes operating in phase space. In this work, we introduce dimensional measures of entropy that admit arbitrary invertible weight functions (subject to curvature and convergence requirements). These ``dimensional entropies'' have physical dimensions of phase-space volume and represent the extent of the level sets of the distribution function. Dimensional entropies with power-law weight functions (related to R\'enyi and Tsallis entropies) are particularly robust: their scale invariance means they require no internal dimensional parameters. We also point out the existence of composite entropy measures constructed from functionals of dimensional entropies. We calculate the response of the dimensional entropies to perturbations, showing that for a structured distribution, perturbations have the largest impact on entropies weighted at a similar phase-space scale; this elucidates the link between dynamics (perturbations) and statistics (entropies). Finally, we derive the corresponding generalized maximum-entropy distributions. Dimensional entropies may be useful as diagnostics of irreversibility and for theoretical modeling (when the underlying irreversible processes in phase space are understood) in chaotic and complex systems, such as collisionless systems of particles with long-range interactions.
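As an illustrative sketch only (the abstract does not give the paper's definitions), the power-law special case can be related to the exponential of the standard R\'enyi entropy, which yields an "effective number of occupied states"; multiplying by an assumed phase-space cell volume gives a quantity with physical dimensions of volume, as the abstract describes. The function name, the discretization into cells, and the `cell_volume` parameter are assumptions for illustration, not the paper's formalism:

```python
import numpy as np

def renyi_volume(p, q, cell_volume=1.0):
    """Effective phase-space volume cell_volume * exp(H_q) for a
    distribution p discretized over equal-size cells.

    H_q is the standard Renyi entropy; q -> 1 recovers the Shannon
    (Boltzmann-Gibbs) limit. Illustrative sketch: the paper's
    dimensional entropies admit general weight functions, of which
    the power-law case shown here is one example.
    """
    p = np.asarray(p, dtype=float)
    p = p / p.sum()  # normalize the discretized distribution
    if np.isclose(q, 1.0):
        # Shannon limit, avoiding log(0) on empty cells
        H = -np.sum(np.where(p > 0, p * np.log(p), 0.0))
    else:
        H = np.log(np.sum(p**q)) / (1.0 - q)
    # exp(H) is dimensionless; cell_volume carries the physical dimensions
    return cell_volume * np.exp(H)

# A distribution concentrated on fewer cells occupies a smaller
# effective volume than a uniform one over the same cells:
uniform = np.ones(8) / 8
peaked = np.array([0.9] + [0.1 / 7] * 7)
```

Note the scale invariance of the power-law case mentioned in the abstract: rescaling `p` by a constant leaves `renyi_volume` unchanged after normalization, so no internal dimensional parameter is needed beyond the overall cell volume.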