Integrating wireless power transfer with mobile edge computing (MEC) has become a powerful solution for increasingly complex and dynamic industrial Internet of Things (IIoT) systems. However, traditional approaches overlook the heterogeneity of tasks and the dynamic arrival of energy in wirelessly powered MEC-enabled IIoT systems. In this paper, we formulate the problem of maximizing the product of the computing rate and the task execution success rate for heterogeneous tasks. To manage energy harvesting adaptively and select appropriate computing modes, we devise an online resource allocation and computation offloading approach based on deep reinforcement learning, which we decompose into two stages: an offloading decision stage and a stopping decision stage. In the offloading decision stage, the system selects the computing mode and dynamically allocates the computation round length for each task by learning from the channel state information and past task experience; this stage allows the system to support heterogeneous computing tasks. In the subsequent stopping decision stage, the number of fading slots devoted to energy harvesting in each round is adjusted adaptively according to the status of each fading slot. Simulation results show that, compared with several existing algorithms, the proposed algorithm allocates resources for heterogeneous tasks more effectively and reduces both the task failure ratio and energy consumption.
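
The following is a minimal sketch of the two-stage decision loop described above, intended only to illustrate how an offloading decision (stage 1) and a per-slot stopping decision (stage 2) might interact within one computation round. All names, constants, and the hand-written heuristics standing in for the trained DRL policies (offloading_decision, stopping_decision, HARVEST_RATE, ENERGY_BUDGET, etc.) are hypothetical assumptions for illustration, not the paper's actual formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical system constants (illustrative, not from the paper).
MAX_ROUND_LEN = 10   # maximum number of fading slots per computation round
HARVEST_RATE = 0.3   # energy harvested per slot, scaled by channel gain
ENERGY_BUDGET = 1.0  # energy required to execute one task

def offloading_decision(channel_gain, task_size):
    """Stage 1 (sketch): choose a computing mode and a round length.

    A real implementation would query a trained DRL policy network;
    a hand-written heuristic stands in for that policy here.
    """
    # Offload when the channel is strong, compute locally otherwise.
    mode = "offload" if channel_gain > 0.5 else "local"
    # Allocate longer rounds to larger tasks, capped at MAX_ROUND_LEN.
    round_len = min(MAX_ROUND_LEN, int(np.ceil(task_size * 5)))
    return mode, round_len

def stopping_decision(battery, slot, round_len):
    """Stage 2 (sketch): decide whether to keep harvesting energy.

    Returns True to harvest in this fading slot, False to stop
    harvesting and devote the remaining slots to computation.
    """
    slots_left = round_len - slot
    # Stop once the battery can cover the task, or when too few
    # slots remain to finish computing (illustrative rule only).
    return battery < ENERGY_BUDGET and slots_left > 1

# One simulated computation round for a randomly drawn task.
channel_gain = rng.uniform(0.0, 1.0)
task_size = rng.uniform(0.2, 1.0)
battery = 0.2

mode, round_len = offloading_decision(channel_gain, task_size)
for slot in range(round_len):
    if stopping_decision(battery, slot, round_len):
        battery += HARVEST_RATE * channel_gain  # harvest in this slot
    else:
        break  # remaining slots go to local computing / offloading

success = battery >= ENERGY_BUDGET
print(f"mode={mode} round_len={round_len} stopped_at_slot={slot} "
      f"battery={battery:.2f} task_success={success}")
```

In the paper's setting, both heuristics would be replaced by learned policies trained to maximize the product of the computing rate and the task execution success rate, with the stopping decision re-evaluated at every fading slot based on its observed status.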