Reducing energy consumption in a multi-access edge computing (MEC) system is a critical goal, both for lowering operational expenses and for promoting environmental sustainability. In this paper, we focus on the problem of managing the sleep states of MEC servers (MECSs) to decrease the overall energy consumption of a MEC system while providing users with acceptable service delays. The proposed method achieves this by dynamically orchestrating MECS activation states based on an analysis of workload distribution patterns. To this end, we formulate MECS sleep control as a constrained combinatorial optimization problem. To solve the formulated problem, we take a deep learning approach. We develop a task arrival rate predictor based on a spatio-temporal graph convolutional network (STGCN) and combine the predicted arrival rates with the queue length distribution to form the input state of our deep reinforcement learning (DRL) agent. To verify the effectiveness of the proposed framework, we conduct extensive simulations using real-world operational datasets and compare our approach against established metaheuristic optimization techniques. The results show that our method optimizes MECS states robustly, maintaining operational efficiency despite prediction uncertainties. As a result, the proposed approach yields substantial improvements in system performance, including higher energy efficiency, a lower service delay violation rate, and shorter computation time for determining operational states.
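
For illustration only, the following minimal sketch shows one way the DRL observation described above could be assembled from STGCN-predicted arrival rates and per-server queue lengths. The function name, array shapes, and normalization are assumptions made for this example and do not reflect the paper's actual implementation.

```python
import numpy as np

def build_drl_state(predicted_arrival_rates, queue_lengths, max_queue=100):
    """Hypothetical helper: concatenate normalized STGCN-predicted task
    arrival rates with the current queue-length distribution to form the
    DRL agent's observation vector."""
    rates = np.asarray(predicted_arrival_rates, dtype=np.float32)
    queues = np.asarray(queue_lengths, dtype=np.float32)
    # Normalize each component so the state lies roughly in [0, 1].
    rates_norm = rates / (rates.max() + 1e-8)
    queues_norm = np.clip(queues / max_queue, 0.0, 1.0)
    return np.concatenate([rates_norm, queues_norm])

# Example: 4 MEC servers with predicted arrival rates (tasks/s) and queue lengths.
state = build_drl_state([12.0, 3.5, 8.1, 0.4], [40, 5, 22, 0])
print(state.shape)  # (8,)
```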