The standard practice in modeling the dynamics and optimal control of a large-population, multi-agent ensemble system represented by its continuum density is to model individual decision making using local feedback information. Compared with a closed-loop optimal control scheme, an open-loop strategy, in which a centralized controller broadcasts identical control signals to the ensemble of agents, reduces the computational and infrastructure requirements for such systems. This work considers the synthesis of open-loop, deterministic optimal controls for the density control of agents governed by marked jump diffusion stochastic differential equations. The density evolves according to a forward-in-time Chapman-Kolmogorov partial integro-differential equation, and the necessary optimality conditions are obtained using the infinite dimensional minimum principle (IDMP). We establish the relationship between the IDMP and the dynamic programming principle, as well as between the IDMP and stochastic dynamic programming, for the synthesized controller. Using the linear Feynman-Kac lemma, a sampling-based algorithm for computing the control is presented and demonstrated on agent dynamics with nonlinear, non-affine drift and noise terms.
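As an illustrative sketch only (the notation $f$, $G$, $\lambda$, $\phi$ is chosen here for exposition and is not the paper's), consider agents whose state $X_t \in \mathbb{R}^n$ follows a jump diffusion driven by a broadcast control $u(t)$, constant diffusion matrix $G$, and a compound Poisson jump process with rate $\lambda$ and state-independent jump density $\phi$. Under these simplifying assumptions, the agent density $p(t,x)$ evolves according to a forward-in-time Chapman-Kolmogorov partial integro-differential equation of the form

% Illustrative sketch; symbols are assumptions, not the paper's notation.
% Assumed agent dynamics: dX_t = f(X_t, u(t)) dt + G dW_t + dJ_t,
% with J_t a compound Poisson process of rate \lambda and jump density \phi.
\begin{equation}
\frac{\partial p}{\partial t}(t,x)
  = -\nabla_x \cdot \big( f(x,u(t))\, p(t,x) \big)
  + \frac{1}{2} \sum_{i,j} \frac{\partial^2}{\partial x_i \partial x_j}
      \big[ (G G^\top)_{ij}\, p(t,x) \big]
  + \lambda \int_{\mathbb{R}^n} \big[ p(t,x-z) - p(t,x) \big]\, \phi(z)\, dz,
\end{equation}

where the first two terms form the usual Fokker-Planck drift and diffusion contributions and the integral term accounts for the jumps; state-dependent diffusion or jump amplitudes would modify these terms accordingly.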