With the rapid growth of data centers, optimizing energy consumption has become a critical challenge. This paper proposes an energy management framework that integrates Long Short-Term Memory (LSTM) networks for load prediction with a dynamic resource allocation algorithm to reduce data center energy consumption. The LSTM model predicts future workloads from historical data, enabling proactive adjustment of resource utilization. Informed by these predictions, the allocation algorithm manages server operations and cooling systems in real time through techniques such as dynamic voltage and frequency scaling (DVFS) and server consolidation, ensuring that resources are allocated efficiently. The framework incorporates a multi-objective optimization approach that balances energy savings against system performance: a genetic algorithm fine-tunes resource allocation based on the predicted load, optimizing jointly for energy consumption and response time. Experimental results show that our method achieves significant energy savings compared to traditional static resource management approaches, with minimal impact on service quality. This research highlights the potential of machine learning-driven optimization in reducing the environmental footprint of data centers while maintaining operational efficiency.
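To make the GA-driven allocation step concrete, the following is a minimal illustrative sketch (not the paper's implementation): given a predicted load, a simple genetic algorithm selects per-server DVFS frequency levels, with 0.0 meaning a server is consolidated off, minimizing a weighted sum of power draw and a response-time proxy. All constants here (fleet size, power model, objective weights) are assumptions for demonstration only.

```python
# Hypothetical sketch of the GA-based allocation step; constants are assumed.
import random

FREQ_LEVELS = [0.0, 0.5, 0.75, 1.0]   # normalized frequencies; 0.0 = server off
N_SERVERS = 8
CAP_PER_UNIT_FREQ = 100.0             # requests/s one server handles at f = 1.0 (assumed)
P_IDLE, P_DYN = 50.0, 150.0           # idle and full-frequency dynamic power, watts (assumed)
W_ENERGY, W_LATENCY = 1.0, 5.0        # multi-objective weights (assumed)

def power(config):
    # Active servers draw idle power plus dynamic power ~ f^3 (cubic DVFS model).
    return sum(P_IDLE + P_DYN * f**3 for f in config if f > 0.0)

def latency_penalty(config, load):
    cap = sum(CAP_PER_UNIT_FREQ * f for f in config)
    if cap <= load:                   # overloaded: effectively infeasible
        return 1e6
    # M/M/1-style response-time proxy: grows as utilization approaches 1.
    return load / (cap - load)

def fitness(config, load):
    return W_ENERGY * power(config) + W_LATENCY * latency_penalty(config, load)

def evolve(load, pop_size=40, generations=60, seed=0):
    rng = random.Random(seed)
    pop = [[rng.choice(FREQ_LEVELS) for _ in range(N_SERVERS)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda c: fitness(c, load))
        next_pop = pop[:4]                                # elitism
        while len(next_pop) < pop_size:
            a, b = rng.sample(pop[:pop_size // 2], 2)     # mate the fitter half
            cut = rng.randrange(1, N_SERVERS)
            child = a[:cut] + b[cut:]                     # one-point crossover
            if rng.random() < 0.2:                        # point mutation
                child[rng.randrange(N_SERVERS)] = rng.choice(FREQ_LEVELS)
            next_pop.append(child)
        pop = next_pop
    return min(pop, key=lambda c: fitness(c, load))

# The load argument stands in for the LSTM's workload forecast.
best = evolve(load=350.0)
print("DVFS plan:", best, "power: %.0f W" % power(best))
```

In the full framework, `load` would come from the LSTM forecast rather than being fixed, and the plan would be re-evolved each control interval so the fleet tracks the predicted workload.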