In wide-area networking, and especially in Low Power Wide Area Networks (LPWANs), efficient resource allocation is crucial to network performance. This work introduces a machine learning-based system that optimizes data transfer rates while minimizing power consumption in LPWANs. The focus is on LoRa, a prominent LPWAN technology known for long-range communication and resilience to interference. Existing LoRa networks suffer performance degradation from interference and congestion caused by dense Internet of Things (IoT) deployments. To address this, advanced Spreading Factor (SF) allocation techniques are employed, combining a metaheuristic optimizer (Particle Swarm Optimization, PSO) and a gradient-boosting ensemble (XGBoost) with a Decision Tree Classifier (DTC) and a Random Forest (RF). Simulation results show that these approaches significantly improve the Packet Delivery Ratio (PDR) and reduce transmit energy consumption across a range of distances, outperforming traditional SF schemes. The RF method, for instance, achieves up to 6.32% higher PDR and up to 16.67% lower energy consumption than the Lowest SF method, and the proposed techniques improve throughput by up to 14.9% over classical methods. The study also examines how the number of gateways, network distance, and SF choice affect PDR and energy utilization, showing that the proposed methods adapt effectively to varying network conditions. These findings highlight the potential of the proposed methods to enhance LoRa network performance, making them well suited to large-scale IoT deployments.
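The core idea of ML-based SF allocation can be framed as a supervised classification problem: given per-node link features, predict which of SF7–SF12 to assign. The sketch below is purely illustrative and is not the authors' pipeline; the features (node–gateway distance, RSSI), the synthetic labelling rule, and all numeric constants are hypothetical stand-ins chosen only to make the example runnable.

```python
# Minimal sketch: SF allocation as classification with a Random Forest.
# The training data here is synthetic; the distance -> SF labelling rule
# and the RSSI path-loss model are assumptions, not the paper's dataset.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n = 2000
distance_km = rng.uniform(0.1, 10.0, n)                       # node-gateway distance
rssi_dbm = -60 - 10 * np.log10(distance_km + 1) + rng.normal(0, 2, n)

# Toy labelling rule: farther nodes get higher spreading factors (SF7-SF12).
sf = np.clip(7 + (distance_km // 2).astype(int), 7, 12)

X = np.column_stack([distance_km, rssi_dbm])
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, sf)

# Predict an SF for a new node 5 km from the gateway with -75 dBm RSSI.
pred = clf.predict([[5.0, -75.0]])
print(int(pred[0]))  # an SF in the 7..12 range
```

In a real deployment the labels would come from simulation or field measurements of PDR and energy per SF, and the same framing carries over to the DTC and XGBoost variants by swapping the classifier.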