Cloud computing is a significant emerging platform for IT enterprises, business applications, and mobile computing. It is a dynamic environment in which allocating resources efficiently and appropriately is a challenging task. Most existing approaches cannot guarantee energy-efficient resource management in the presence of time-dependent tasks. Cloud resources consume a large amount of energy, and poor resource management can increase the makespan of the entire network. Appropriate task scheduling can satisfy user requirements. Reducing energy consumption while satisfying users' quality-of-service (QoS) requirements is essential to meet each user's service level agreement (SLA). Therefore, to achieve energy-efficient task scheduling with unique rankings, this paper proposes a Technique for Order Preference by Similarity to Ideal Solution (TOPSIS) based unique ranking scheme (uRank-TOPSIS) and a Hybrid state-action-reward-state-action (SARSA) Reinforcement Learning with Black Widow Algorithm (HSRLBA). Task scheduling is carried out by uRank-TOPSIS, which produces a set of unique weights and ranks the alternatives. Resource allocation is then performed by HSRLBA, in which the Black Widow Algorithm (BWA) accelerates the convergence of parallel agents in the SARSA model of reinforcement learning. Experiments are conducted in the CloudSim simulator. The efficiency of the proposed framework is measured in terms of makespan, Task Completion Ratio (TCR), total energy consumption, response time, and resource utilization. The proposed uRank-TOPSIS and HSRLBA achieved an energy consumption of 325 kWh, a response time of 15.42 s, a makespan of 1150 s, a TCR of 98%, and a resource utilization of 92%. The proposed framework maximizes TCR and resource utilization while minimizing makespan, energy consumption, and response time.
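
As a rough illustration of the TOPSIS ranking step only (not the authors' uRank-TOPSIS implementation), the sketch below scores and orders candidate tasks by their closeness to an ideal solution. The task attributes, weights, and benefit/cost designations are hypothetical assumptions introduced purely for the example.

```python
# Illustrative sketch, assuming hypothetical task criteria
# (task length, deadline slack, memory demand); the actual
# uRank-TOPSIS criteria and weighting scheme are defined in the paper.
import numpy as np

def topsis_rank(decision_matrix, weights, benefit_criteria):
    """Rank alternatives (rows) against criteria (columns) with standard TOPSIS.

    decision_matrix : (n_alternatives, n_criteria) raw scores
    weights         : criterion weights summing to 1
    benefit_criteria: True where larger is better, False where smaller is better
    """
    m = np.asarray(decision_matrix, dtype=float)
    # Vector-normalise each criterion column, then apply the weights.
    norm = m / np.linalg.norm(m, axis=0)
    v = norm * np.asarray(weights, dtype=float)

    # Ideal best/worst per criterion, depending on benefit vs. cost.
    best = np.where(benefit_criteria, v.max(axis=0), v.min(axis=0))
    worst = np.where(benefit_criteria, v.min(axis=0), v.max(axis=0))

    # Euclidean distances to the ideal best and ideal worst solutions.
    d_best = np.linalg.norm(v - best, axis=1)
    d_worst = np.linalg.norm(v - worst, axis=1)

    # Closeness coefficient: higher means closer to the ideal solution.
    closeness = d_worst / (d_best + d_worst)
    return np.argsort(-closeness), closeness

# Hypothetical tasks: [task length (cost), deadline slack (benefit), memory demand (cost)]
tasks = [[4000, 12.0, 512],
         [2500,  6.0, 256],
         [6000, 20.0, 1024]]
order, scores = topsis_rank(tasks, weights=[0.5, 0.3, 0.2],
                            benefit_criteria=[False, True, False])
print("scheduling order (task indices):", order, "closeness:", scores)
```

In the proposed framework, the closeness scores would feed the unique-ranking step that orders tasks for the HSRLBA resource-allocation stage; the sketch above shows only the generic TOPSIS computation under the stated assumptions.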