In this paper, we propose a sliding-mode-based stochastic distribution control algorithm for nonlinear systems, in which the sliding-mode controller is designed to stabilize the stochastic system while the stochastic distribution control shapes the probability density function of the sliding surface to follow the desired probability density function as closely as possible. The Kullback-Leibler divergence is adopted as the performance criterion for the stochastic distribution control, and the parameters of the stochastic distribution controller are updated at every sampling interval rather than in batch mode. It is shown that the estimated weight vector converges to its ideal value and that the closed-loop system is asymptotically stable under a rank condition, which is much weaker than the persistent-excitation condition. The effectiveness of the proposed algorithm is illustrated by simulation.
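The following is a minimal, hypothetical sketch of the recursive idea summarized above, not the paper's actual algorithm: it assumes the probability density function of the sliding variable is approximated by a weighted expansion over a fixed basis, and the weight vector is adjusted by one gradient step on the Kullback-Leibler divergence to the desired density at each sampling interval instead of a batch fit. The Gaussian basis, target density, grid, and learning rate are illustrative assumptions.

```python
import numpy as np

# Hypothetical recursive KL-divergence-based weight update (illustrative only;
# the basis, target PDF, step size, and grid are assumptions, not taken from the paper).

s = np.linspace(-3.0, 3.0, 201)                         # grid over the sliding variable
centers = np.linspace(-2.5, 2.5, 8)                     # assumed basis-function centres
B = np.exp(-0.5 * ((s[:, None] - centers) / 0.6) ** 2)  # Gaussian basis matrix, shape (201, 8)

g = np.exp(-0.5 * (s / 0.8) ** 2)                       # assumed desired PDF of the sliding variable
g /= np.trapz(g, s)

def kl(w):
    """KL divergence between the desired PDF g and the normalised model B @ w."""
    gamma = np.clip(B @ w, 1e-9, None)
    gamma = gamma / np.trapz(gamma, s)
    return np.trapz(g * np.log(g / gamma), s)

w = np.full(len(centers), 0.2)                          # initial weight vector
eta = 0.05                                              # assumed learning rate
print(f"KL before adaptation: {kl(w):.4f}")

for _ in range(500):                                    # one update per sampling interval
    phi = np.clip(B @ w, 1e-9, None)                    # unnormalised model PDF
    Z = np.trapz(phi, s)
    # gradient of KL(g || phi/Z) with respect to w
    grad = np.trapz(B, s, axis=0) / Z - np.trapz((g / phi)[:, None] * B, s, axis=0)
    w = np.clip(w - eta * grad, 1e-6, None)             # keep the weights positive

print(f"KL after adaptation:  {kl(w):.4f}")
```

In this sketch the per-interval update plays the role of the recursive (non-batch) parameter adaptation mentioned in the abstract; how the weights enter the actual controller and the conditions guaranteeing convergence (the rank condition) are developed in the body of the paper.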