Mobile user terminals in 5G networks can generate massive computational workloads that demand more computation and caching resources than the terminals' own processors can provide. Emerging mobile edge computing (MEC) has become a key solution to this computation problem by offloading computation-intensive workloads to MEC servers. To make full use of the limited resources on the MEC side, service caching pre-stores the specific executable programs, databases, or libraries required to execute the workloads offloaded from user terminals. In this study, a service caching-assisted MEC model is designed. Based on the proposed MEC model, a decentralized model-free deep reinforcement learning algorithm-based service caching optimization policy (DDSCOP) is proposed to minimize the long-term weighted average cost. Considering the time-varying request workloads from user terminals, the stochastic channel state, and the arrival of renewable resources at the MEC servers, DDSCOP finds a near-optimal service caching policy by training neural networks to decide whether a service is cached at all and, if so, which MEC server hosts it. Numerical experiments verify the convergence and effectiveness of DDSCOP under extensive parameter configurations and show that it outperforms three baseline algorithms.
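The decision structure described above (whether to cache a service and, if so, on which MEC server) can be illustrated as a small Markov decision process. The sketch below uses tabular Q-learning as a simplified stand-in for the paper's deep-RL policy; the state space, cost weights, and server count are all hypothetical, chosen only to show the cache-or-offload trade-off, not to reproduce DDSCOP itself.

```python
import numpy as np

rng = np.random.default_rng(0)

N_SERVERS = 2                      # hypothetical candidate MEC servers
N_ACTIONS = N_SERVERS + 1          # action 0 = do not cache the service
WORKLOAD_LEVELS = 3                # discretized time-varying request workload
CHANNEL_STATES = 2                 # stochastic channel: 0 = bad, 1 = good
N_STATES = WORKLOAD_LEVELS * CHANNEL_STATES

def step_cost(state, action):
    """Weighted cost: hosting the cache costs storage/energy; not hosting it
    incurs a delay penalty that grows with workload and a bad channel."""
    workload, channel = divmod(state, CHANNEL_STATES)
    if action == 0:                       # not cached: offload delay penalty
        return (workload + 1) * (2.0 if channel == 0 else 1.0)
    return 1.2 + 0.1 * action             # cached on server `action`

def next_state():
    """Workload and channel drawn i.i.d. each epoch, for simplicity."""
    return int(rng.integers(N_STATES))

# Tabular Q-learning, minimizing long-term discounted cost
Q = np.zeros((N_STATES, N_ACTIONS))
alpha, gamma, eps = 0.1, 0.9, 0.1
s = next_state()
for _ in range(20000):
    # epsilon-greedy over the minimum-cost action
    a = int(rng.integers(N_ACTIONS)) if rng.random() < eps else int(np.argmin(Q[s]))
    c = step_cost(s, a)
    s2 = next_state()
    Q[s, a] += alpha * (c + gamma * Q[s2].min() - Q[s, a])
    s = s2

policy = Q.argmin(axis=1)   # learned per-state caching decision
```

With these toy costs, the learned policy caches the service on a server when the workload is high (delay penalty dominates) and skips caching when the workload is low and the channel is good, mirroring the trade-off the abstract describes.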