An energy hub (EH) provides an effective solution for managing local integrated energy systems (IES), supporting the optimal dispatch and mutual conversion of distributed energy resources (DER) across multiple energy forms. However, the intrinsic stochasticity of renewable generation intensifies fluctuations in energy production and widens peak-to-valley differences when integrated into large-scale grids, significantly reducing the stability of the power grid. This paper presents a distributed privacy-preserving energy scheduling method based on multi-agent deep reinforcement learning for EH clusters with renewable energy generation. First, each EH is treated as an agent, and the energy scheduling problem is formulated as a Markov decision process. Second, the objective function is defined as minimizing the total economic cost, including carbon trading costs, guiding the agents toward low-carbon decisions. Finally, differential privacy protection is applied to sensitive data within each EH: noise is injected and compensated by the energy storage systems, so that gas and electricity purchases remain unchanged while the original data are blurred. Simulation results demonstrate that the agents learn from environmental information and generate real-time optimized strategies that effectively handle the uncertainty of renewable energy. Furthermore, after noise injection, the original data can no longer be recovered from external observations, ensuring the protection of sensitive information.
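The storage-compensated noise-injection idea described above can be illustrated with a minimal sketch. This is not the paper's implementation; the Laplace mechanism, the function name, and the `epsilon`/`sensitivity` parameters are assumptions chosen to show how a noised load profile can be published while storage offsets the noise so the external purchase matches the true load:

```python
import numpy as np

def blur_load(true_load, epsilon=1.0, sensitivity=5.0, seed=None):
    """Publish a Laplace-noised load profile (differential-privacy style)
    while the energy storage system compensates the noise, so the external
    electricity purchase still equals the true load.

    true_load   : per-interval load (e.g., kWh)
    epsilon     : privacy budget; smaller -> stronger blurring
    sensitivity : assumed per-interval sensitivity of the load data
    Returns (published_load, storage_compensation, purchase).
    """
    rng = np.random.default_rng(seed)
    true_load = np.asarray(true_load, dtype=float)
    # Laplace noise with scale = sensitivity / epsilon
    noise = rng.laplace(0.0, sensitivity / epsilon, size=true_load.shape)
    published_load = true_load + noise        # blurred data visible outside the EH
    storage_compensation = -noise             # storage charges/discharges to offset noise
    purchase = published_load + storage_compensation  # equals true_load: purchases unchanged
    return published_load, storage_compensation, purchase
```

Here the purchase schedule is identical to the true load, so an observer of purchases and published data cannot separate the real consumption from the storage-injected noise.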