Workflow scheduling is a key problem to be solved in the cloud to increase the quality of service. Several research works have been designed to perform workflow scheduling using different techniques. However, the scheduling performance of existing techniques was not effective when a larger number of user tasks was considered, and the makespan of workflow scheduling remained high. To address these limitations, a Gene Optimized Deep Neural Round Robin Scheduling (GODNRRS) technique is proposed. The designed GODNRRS technique contains three layers, namely the input, hidden, and output layers, to efficiently perform workflow scheduling in the cloud. The GODNRRS technique initially receives the user tasks as input in the input layer and forwards them to the hidden layers. After taking the input, the GODNRRS technique initializes the gene population with the assistance of the virtual machines in the Amazon cloud server at the first hidden layer. Next, it determines a fitness function for each virtual machine using its energy, memory, CPU time, and bandwidth capacity at the second hidden layer. Afterward, it defines a weight for each virtual machine at the third hidden layer depending on its estimated fitness. Consequently, it distributes the user tasks to the optimal virtual machines in a cyclic manner according to their weight values at the fourth hidden layer. At last, the output layer renders the scheduled-task results. Thus, the GODNRRS technique handles workflows in the cloud with improved scheduling efficiency, lower energy consumption, and shorter makespan. The experimental evaluation of the GODNRRS technique is conducted using metrics such as scheduling efficiency, makespan, and energy consumption with respect to different numbers of user tasks from the LIGO, Montage, and CyberShake real-time applications. The experimental results show that the GODNRRS technique increases the efficiency and reduces the makespan of workflow scheduling in the cloud as compared to state-of-the-art works.
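
As a rough illustration of the weighted round-robin distribution step described above, the sketch below assigns tasks to virtual machines cyclically, in proportion to fitness-derived weights. It is a minimal sketch only: the `VirtualMachine` attributes, the equal-weight fitness formula, and the per-turn quota rule are illustrative assumptions and do not reproduce the exact GODNRRS formulation or its gene-optimization step.

```python
# Minimal sketch of a fitness-weighted round-robin task assignment.
# The VM attributes, the fitness formula, and the quota rule are
# illustrative assumptions, not the exact GODNRRS formulation.
from dataclasses import dataclass
from itertools import cycle
from typing import Dict, List


@dataclass
class VirtualMachine:
    name: str
    energy: float      # normalized remaining energy capacity
    memory: float      # normalized available memory
    cpu_time: float    # normalized available CPU time
    bandwidth: float   # normalized available bandwidth

    def fitness(self) -> float:
        # Assumed fitness: equal-weight sum of the normalized capacities.
        return 0.25 * (self.energy + self.memory + self.cpu_time + self.bandwidth)


def schedule(tasks: List[str], vms: List[VirtualMachine]) -> Dict[str, List[str]]:
    """Distribute tasks over the VMs cyclically, in proportion to their weights."""
    total = sum(vm.fitness() for vm in vms)
    # Weight of each VM = its share of the total fitness.
    weights = {vm.name: vm.fitness() / total for vm in vms}
    # Per-turn quota: higher-weight VMs receive more tasks per cycle.
    min_w = min(weights.values())
    quotas = {name: max(1, round(w / min_w)) for name, w in weights.items()}

    assignment: Dict[str, List[str]] = {vm.name: [] for vm in vms}
    task_iter = iter(tasks)
    try:
        for vm in cycle(vms):                  # round-robin over the VMs
            for _ in range(quotas[vm.name]):   # weighted share within each turn
                assignment[vm.name].append(next(task_iter))
    except StopIteration:
        pass                                   # all tasks have been assigned
    return assignment


if __name__ == "__main__":
    vms = [
        VirtualMachine("vm1", 0.9, 0.8, 0.7, 0.9),
        VirtualMachine("vm2", 0.5, 0.4, 0.6, 0.5),
    ]
    print(schedule([f"task{i}" for i in range(10)], vms))
```

In the actual technique, the weights would be derived from the gene-optimized fitness values of the virtual machines rather than the fixed equal-weight formula assumed here.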