We consider continuous-time Markovian discrete-state dynamics on random networks of interacting agents and study the large-population limit. The dynamics are projected onto low-dimensional collective variables given by the shares of each discrete state in the system, or in certain subsystems, and we prove general conditions for the convergence of the collective-variable dynamics to a mean-field ordinary differential equation. We discuss convergence to this mean-field limit for a continuous-time noisy version of the so-called "voter model" on Erdős–Rényi random graphs, on the stochastic block model, and on random regular graphs. Moreover, we study a heterogeneous population of agents. For each of these types of interaction networks, we specify the convergence conditions in terms of the corresponding model parameters.
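For orientation, the following heuristic sketch (illustrative only, not the paper's general construction) indicates what such a mean-field equation can look like. Consider a two-state noisy voter model in which an agent in state $0$ adopts state $1$ at rate $\lambda$ times the share of its neighbours in state $1$, plus a spontaneous flipping rate $\varepsilon_0$, and an agent in state $1$ switches at rate $\lambda$ times the share of its neighbours in state $0$, plus $\varepsilon_1$; the rates $\lambda,\varepsilon_0,\varepsilon_1$ are assumed parameters of this sketch. Under a well-mixing approximation, the share $c(t)$ of agents in state $1$ evolves as
\[
\frac{\mathrm{d}c}{\mathrm{d}t}
= (1-c)\bigl(\lambda c + \varepsilon_0\bigr) - c\bigl(\lambda(1-c) + \varepsilon_1\bigr)
= \varepsilon_0(1-c) - \varepsilon_1 c,
\]
so the imitation terms cancel and the noise rates alone drive the deterministic limit of the share. The results above give general conditions under which the empirical share process on the random graph ensembles listed converges to such an ordinary differential equation.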