Electric Vehicle-assisted Multi-access Edge Computing (EV-MEC) is a promising paradigm in which EVs share their computation resources at the network edge to perform intensive computing tasks while charging. A fundamental problem in EV-MEC is to jointly decide the charging power of EVs and the allocation of computation tasks to EVs, so as to meet both the diverse charging demands of EVs and the stringent performance requirements of heterogeneous tasks. To address this challenge, we propose OCEAN, a new joint charging scheduling and computation offloading scheme for EV-MEC. Specifically, we formulate a cooperative two-timescale optimization problem that minimizes the charging load and its variance subject to the performance requirements of computation tasks. We then decompose this sophisticated optimization problem into two sub-problems: charging scheduling and computation offloading. For the former, we develop a novel safe deep reinforcement learning (DRL) algorithm and theoretically prove the feasibility of the learned charging scheduling policy. For the latter, we reformulate it as an integer non-linear programming problem and derive the optimal offloading decisions. Extensive experimental results demonstrate that OCEAN achieves performance close to that of the optimal strategy and reduces charging load variance by up to 24% compared with three state-of-the-art algorithms, while satisfying the charging demands of all EVs.
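To make the two-timescale structure concrete, the following is a minimal sketch, not the paper's actual algorithm: on the slow timescale a stub heuristic stands in for the safe-DRL charging policy, with a projection step enforcing the feasibility guarantee, and on the fast timescale the integer offloading sub-problem is solved by brute-force enumeration rather than the paper's reformulation. All function names, parameters, and numbers here are hypothetical.

```python
# Hedged sketch of a two-timescale charging/offloading loop.
# The "policy" and the solver below are stand-ins, NOT OCEAN itself.
from itertools import product

def project_feasible(powers, p_max, remaining_demand, slot_len):
    """Clip each EV's charging power to its rated limit and to its remaining
    energy demand, mimicking the feasibility guarantee that the safe-DRL
    policy is proven to satisfy."""
    return [min(p, p_max, d / slot_len)
            for p, d in zip(powers, remaining_demand)]

def offload(tasks, capacities):
    """Exhaustively assign tasks (CPU workloads) to EVs so that no EV
    exceeds its spare capacity, minimizing load variance across EVs."""
    n = len(capacities)
    best, best_var = None, float("inf")
    for assign in product(range(n), repeat=len(tasks)):
        load = [0.0] * n
        for t, ev in zip(tasks, assign):
            load[ev] += t
        if any(l > c for l, c in zip(load, capacities)):
            continue  # violates an EV's capacity constraint
        mean = sum(load) / n
        var = sum((l - mean) ** 2 for l in load) / n
        if var < best_var:
            best, best_var = assign, var
    return best

# Slow timescale: one charging slot for 3 EVs (all values illustrative).
demand = [10.0, 4.0, 0.5]   # remaining energy demand per EV (kWh)
raw = [7.0, 7.0, 7.0]       # stub "policy" output (kW), pre-projection
powers = project_feasible(raw, p_max=6.0,
                          remaining_demand=demand, slot_len=1.0)

# Fast timescale: allocate arriving tasks within the slot.
assignment = offload(tasks=[2.0, 3.0, 1.0], capacities=[4.0, 4.0, 4.0])
```

Here `powers` comes out as `[6.0, 4.0, 0.5]` (each EV capped by its rated power or remaining demand), and the enumeration spreads the three tasks across the three EVs to minimize load variance; the real scheme replaces both stand-ins with the learned policy and the optimal integer-programming solution.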