The rapid development of wireless communication technologies and the enormous growth of computation, storage, and data-transmission tasks have driven the expansion of the Internet of Things (IoT), real-time media streaming, and related technologies. Edge-Cloud Computing (ECC) can better meet the diverse needs of IoT users: it combines the benefits of Mobile Cloud Computing (MCC) and Mobile Edge Computing (MEC) to satisfy energy-consumption and delay requirements and to achieve more stable and affordable task execution. The most significant challenge in ECC is making real-time task-offloading decisions. To generate offloading decisions in ECC environments efficiently and near-optimally, a Deep Reinforcement Learning (DRL)-based Distributed task Offloading (DRL-DO) framework is proposed. Simulation results demonstrate the effectiveness of the DRL-DO framework: it achieves a high Gain Ratio (GR) and greatly reduces energy consumption and response time, while incurring only a moderate time cost compared with other offloading algorithms.
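
The abstract does not specify the network architecture or the state/action design of DRL-DO, so the following is only a minimal sketch of the general idea it describes: a DRL agent mapping a task's state to a discrete offloading decision. The class and function names, the state features, and the three-way action space (local / edge / cloud) are illustrative assumptions, not the paper's actual design.

```python
# Minimal sketch (assumed, not the paper's actual DRL-DO design): a small
# Q-network that maps a task's state to a discrete offloading decision.
import torch
import torch.nn as nn

ACTIONS = ["local", "edge", "cloud"]  # assumed discrete offloading targets


class OffloadingQNet(nn.Module):
    """Tiny Q-network: task state -> one Q-value per offloading target."""

    def __init__(self, state_dim: int = 4, num_actions: int = len(ACTIONS)):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim, 64),
            nn.ReLU(),
            nn.Linear(64, num_actions),
        )

    def forward(self, state: torch.Tensor) -> torch.Tensor:
        return self.net(state)


def decide_offloading(qnet: OffloadingQNet, state: torch.Tensor) -> str:
    """Greedy offloading decision for one task (no exploration)."""
    with torch.no_grad():
        q_values = qnet(state)
    return ACTIONS[int(q_values.argmax().item())]


if __name__ == "__main__":
    # Assumed state: [task size, required CPU cycles, channel gain, deadline],
    # each normalized to [0, 1].
    qnet = OffloadingQNet()
    task_state = torch.tensor([0.6, 0.8, 0.3, 0.5])
    print("Offloading decision:", decide_offloading(qnet, task_state))
```

In a full DRL pipeline such a network would be trained against a reward combining energy consumption and response time, which is how the paper's evaluation metrics relate to the decision policy.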