As global energy demand rises and climate change poses a growing threat, developing sustainable, low-carbon energy solutions has become imperative. This study focuses on optimizing shared energy storage (SES) and distribution networks (DNs) using deep reinforcement learning (DRL) to enhance operational decision-making. An innovative dynamic carbon intensity calculation method is proposed that, by tracing the network topology in both spatial and temporal dimensions, computes the power system's indirect carbon emissions more accurately and thereby refines carbon responsibility allocation on the user side. Additionally, we integrate user-side SES and ladder-type carbon emission pricing into the DN to form a low-carbon economic dispatch model. Framing the problem as a Markov decision process (MDP), we employ DRL, specifically the deep deterministic policy gradient (DDPG) algorithm enhanced with prioritized experience replay (PER) and orthogonal regularization (OR), to achieve both economic efficiency and environmental sustainability. Simulation results indicate that the proposed method significantly reduces the operating costs and carbon emissions of the DN. This study offers an innovative perspective on the synergistic optimization of SES and DNs and provides a practical methodology for low-carbon economic dispatch in power systems.
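To make the ladder-type carbon emission pricing mentioned above concrete, the sketch below shows one common formulation: emissions above a free quota are charged in fixed-width tiers whose unit price rises with each tier. The function name and all parameter values are illustrative assumptions, not the settings used in this study.

```python
def ladder_carbon_cost(emissions, quota, interval, base_price, step):
    """Ladder-type carbon cost (illustrative sketch).

    Emissions up to `quota` (tCO2) are free; the excess is priced in
    tiers of width `interval`, with the unit price starting at
    `base_price` and increasing by `step` per tier.
    """
    excess = emissions - quota
    if excess <= 0:
        return 0.0  # within the free quota: no carbon cost in this simple variant

    cost = 0.0
    tier = 0
    while excess > 0:
        tranche = min(excess, interval)       # amount billed in this tier
        cost += tranche * (base_price + tier * step)
        excess -= tranche
        tier += 1
    return cost
```

For example, with a 10 tCO2 quota, 5 tCO2 tiers, a base price of 2, and a step of 1, emitting 25 tCO2 costs 5*2 + 5*3 + 5*4 = 45. The rising marginal price is what gives the dispatch model an incentive to keep emissions near the quota.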