With the emergence of delay-sensitive task completion, computational offloading becomes increasingly desirable because of the end-user's limitations in performing computation-intensive applications. Fog computing enables computational offloading for end-users in support of delay-sensitive task provisioning. In this paper, we study computational offloading for multiple tasks with various delay requirements, initiated one task at a time at the end-user side. In our scenario, the end-user offloads the task data to its primary fog node. However, because the computing resources of fog nodes are limited compared to those of the remote cloud server, it is challenging to process the task data entirely at the primary fog node within the delay deadline imposed by the applications initiated by the end-user. Therefore, the primary fog node is mainly responsible for deciding the amount of task data to be offloaded to a secondary fog node and/or the remote cloud. Moreover, the computational resource allocation, in terms of CPU cycles to process each bit of the task data at a fog node, and the transmission resource allocation between a fog node and the remote cloud are also important factors to be considered. We formulate the above problem as a Quadratically Constrained Quadratic Program (QCQP) and provide a solution. Our extensive simulation results demonstrate the effectiveness of the proposed offloading scheme under different delay deadlines and traffic intensity levels.

INDEX TERMS 5G and beyond, computation offloading, mobile edge computing, fog computing, resource allocation, offloading decision.

I. INTRODUCTION

With the emergence of ultra-reliable and low-latency communications (uRLLC) [1]-[4], latency- and reliability-aware mission-critical applications are growing rapidly. A few examples are autonomous driving, virtual and augmented reality, cloud robotics, remote surgery, and factory automation. At the same time, the end-user's limited computational resources constrain the user experience (e.g., latency and reliability) for computation-intensive applications. Cloud computing has already proven its significance in processing computation-intensive tasks; however, the physical distance between the end-user and the remote cloud data center, together with the burden on the fronthaul link, is a major barrier for low-latency applications. To address these challenges, fog computing [5], [6], often viewed as a middleware layer between the end-user and the cloud, extends the computational, communication, and storage resources of cloud computing close to the network edge.

A. MOTIVATION

For computation-intensive task processing in a fog computing scenario, the end-user offloads the data either partially or entirely to nearby fog computing node(s). It would be an ideal solution if a single fog computing node (hereinafter referred to as a fog node) is able to com...