In recent years, large applications that require near real-time processing are increasingly deployed on devices with limited resources. Multi-access edge computing (MEC) is a computing paradigm that addresses this problem by placing servers as close to resource-constrained devices as possible. However, the edge device must balance multiple conflicting objectives, viz., energy consumption, latency, task drop rate, and quality of experience. Many previous approaches optimize only a single objective or a fixed linear combination of multiple objectives. Such approaches do not ensure the best performance for applications running on edge servers, since there is no guarantee that the solution they obtain lies on the Pareto front. In this work, a Multi-Objective Reinforcement Learning (MORL) model with an Actor-Critic architecture is proposed to optimize drop rate, latency, and energy consumption when making offloading decisions. The model is compared with MORL-Tabular, MORL-Deep Q Network, and MORL-Double Deep Q Network models. The proposed model outperforms all the other models in terms of drop rate and latency.
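To make the Pareto-front claim concrete, one illustrative formulation (the notation here is assumed for exposition, not taken from the proposed model) is: let each offloading decision $x$ have an objective vector $f(x) = (f_1(x), f_2(x), f_3(x))$ for drop rate, latency, and energy consumption, all to be minimized. A decision $x$ dominates $x'$ when
\[
f_i(x) \le f_i(x')\ \ \forall i \in \{1,2,3\} \quad \text{and} \quad \exists j:\ f_j(x) < f_j(x'),
\]
and the Pareto front is the set of non-dominated decisions. Minimizing a fixed weighted sum $\sum_i w_i f_i(x)$ with predetermined non-negative weights can only recover Pareto-optimal points lying on the convex portion of this front, which is why fixed linear scalarization may miss desirable trade-offs.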