We propose an approach for learning probability distributions as differentiable quantum circuits (DQC) that enable efficient quantum generative modelling (QGM) and synthetic data generation. In contrast to existing QGM approaches, we train a DQC-based model in which data is encoded in a latent space with a phase feature map, followed by a variational quantum circuit. We then map the trained model to the bit basis using a fixed unitary transformation, which coincides with a quantum Fourier transform circuit in the simplest case. This allows fast sampling from parametrized distributions using a single-shot readout. Importantly, latent-space training provides models that are automatically differentiable, and we show how samples from solutions of stochastic differential equations (SDEs) can be accessed by solving stationary and time-dependent Fokker-Planck equations with a quantum protocol. Finally, our approach opens a route to multidimensional generative modelling with qubit registers explicitly correlated via a (fixed) entangling layer. In this case, quantum computers can offer an advantage as efficient samplers that perform complex inverse transform sampling enabled by the fundamental laws of quantum mechanics. On the technical side, our advances are multiple: we introduce the phase feature map, analyze its properties, and develop frequency-taming techniques that include qubit-wise training and feature-map sparsification.
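To make the sampling step concrete, the sketch below is our own minimal numerical illustration (not code from the paper) of the simplest case described above: a phase feature map encodes a latent value x as binary-weighted phases on an n-qubit register, an inverse quantum Fourier transform maps the latent state to the bit basis, and a single-shot readout of the resulting Born distribution returns a sample. The register size, phase convention, and helper names are assumptions made for illustration only.

```python
import numpy as np

n = 6        # number of qubits (assumed for illustration)
N = 2 ** n   # register dimension

def phase_feature_map(x):
    """Latent state from binary-weighted phases:
    |psi(x)> = (1/sqrt(N)) * sum_k exp(2*pi*i*k*x) |k>  (assumed convention)."""
    k = np.arange(N)
    return np.exp(2j * np.pi * k * x) / np.sqrt(N)

def inverse_qft(N):
    """Dense inverse quantum Fourier transform on the N-dimensional register."""
    k = np.arange(N)
    return np.exp(-2j * np.pi * np.outer(k, k) / N) / np.sqrt(N)

rng = np.random.default_rng(0)
x = 0.37                                   # latent-space point in [0, 1)

# Map the phase-encoded latent state to the bit basis.
state = inverse_qft(N) @ phase_feature_map(x)
probs = np.abs(state) ** 2                 # Born-rule distribution over bitstrings
probs /= probs.sum()                       # guard against floating-point drift

# Single-shot readout: one measurement yields one sample, peaked near N*x.
sample = int(rng.choice(N, p=probs))
print(f"sampled bitstring {sample:0{n}b} (integer {sample}), expected near {round(N * x)}")
```

In the full protocol a trained variational circuit would act on the latent state before the fixed transform, so the measured bitstrings follow the learned distribution rather than a single peak; the sketch omits that circuit and only illustrates the latent-to-bit-basis mapping and single-shot sampling.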