Workflow scheduling is crucial to the efficient operation of cloud platforms and has attracted considerable attention. Many algorithms have been proposed to schedule workflows under budget constraints so as to optimize their makespan on cloud resources. Nevertheless, the hourly-based billing model of cloud computing remains an ongoing challenge for workflow scheduling and easily leads to longer makespans or even infeasible solutions. Moreover, because of data dependencies among workflow tasks, many idle time slots inevitably arise on cloud resources. Few works adequately exploit these idle slots to duplicate tasks' predecessors and thereby shorten their completion time, minimizing the workflow's makespan while satisfying its budget constraint. Motivated by these observations, we propose a task duplication based scheduling algorithm, namely TDSA, to optimize the makespan of budget-constrained workflows on cloud platforms. TDSA incorporates two novel mechanisms: 1) a dynamic sub-budget allocation mechanism, which recovers the unused budget of scheduled tasks and redistributes the remaining budget, making it possible to rent more expensive and powerful cloud resources to accelerate the completion of unscheduled tasks; and 2) a duplication-based task scheduling mechanism, which exploits idle slots on resources to selectively duplicate tasks' predecessors, thus advancing these tasks' completion time while respecting their sub-budget constraints. Finally, we carry out four groups of experiments, three on randomly generated workflows and one on real-world workflows, to compare the proposed TDSA with four baseline algorithms. Experimental results confirm that TDSA clearly outperforms the baselines in reducing workflow makespan and improving the utilization of cloud computing resources.
INDEX TERMS Cloud computing, task duplication, workflow scheduling, resource provision, heuristic mechanism.
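The abstract describes the two mechanisms only at a high level. As an illustration of the first one, the following is a minimal Python sketch of dynamic sub-budget reallocation, assuming a simple proportional redistribution policy; the Task fields and function name are hypothetical and not taken from the TDSA paper.

```python
# Minimal sketch (not the authors' TDSA implementation): after tasks are
# scheduled, reclaim the budget they did not spend and spread it across the
# still-unscheduled tasks in proportion to their current sub-budgets.
# All field names (sub_budget, actual_cost, scheduled) are assumptions.

from dataclasses import dataclass

@dataclass
class Task:
    name: str
    sub_budget: float         # budget currently allotted to this task
    actual_cost: float = 0.0  # cost actually paid once scheduled
    scheduled: bool = False

def redistribute_unused_budget(tasks: list[Task]) -> None:
    """Recover unused budget from scheduled tasks and redistribute it
    proportionally among unscheduled tasks."""
    recovered = sum(t.sub_budget - t.actual_cost for t in tasks if t.scheduled)
    pending = [t for t in tasks if not t.scheduled]
    if recovered <= 0 or not pending:
        return
    total_pending = sum(t.sub_budget for t in pending)
    for t in pending:
        share = t.sub_budget / total_pending if total_pending > 0 else 1 / len(pending)
        t.sub_budget += recovered * share
    for t in tasks:
        if t.scheduled:
            t.sub_budget = t.actual_cost  # slack has been handed on, avoid double counting

# Example: task A finished cheaper than budgeted, so B and C gain headroom
# to rent faster (more expensive) VM instances.
tasks = [Task("A", sub_budget=10.0, actual_cost=7.0, scheduled=True),
         Task("B", sub_budget=6.0), Task("C", sub_budget=4.0)]
redistribute_unused_budget(tasks)
print([(t.name, round(t.sub_budget, 2)) for t in tasks])
# [('A', 7.0), ('B', 7.8), ('C', 5.2)]
```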
Accurate recognition of fruits in the orchard is an important step for robotic picking in the natural environment, yet many CNN models have low recognition rates when dealing with irregularly shaped and densely clustered fruits such as grape bunches. Applying transformer architectures to computer vision for image processing is a recent trend. This paper applies Swin Transformer and DETR models to grape bunch detection and compares them with traditional CNN models such as Faster R-CNN, SSD, and YOLO. The optimal number of stages for the Swin Transformer is also selected through experiments. Furthermore, the latest YOLOX model is compared with the Swin Transformer, and the experimental results show that YOLOX achieves higher accuracy and a better detection effect. The above models are trained on a red grape dataset collected under natural light, and the dataset is expanded through image data augmentation to improve training. After 200 epochs of training, SwinGD obtained an mAP of 94% at IoU = 0.5. Under overexposure, low light, and occlusion, SwinGD recognizes grape bunches more accurately and robustly than the other models, and it also performs better on dense grape bunches. Furthermore, 100 grape images containing 655 grape bunches were downloaded from Baidu Images to evaluate detection performance, on which the Swin Transformer achieved an accuracy of 91.5%. To verify the generality of SwinGD, we also tested it on green grape images, and the results show that SwinGD performs well in practical applications. The success of SwinGD provides a new solution for precision harvesting in agriculture.
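The reported mAP is measured at an IoU threshold of 0.5. As a brief illustration of that criterion (not code from the SwinGD paper), the sketch below computes intersection-over-union between two axis-aligned boxes and applies the 0.5 matching rule; the (x1, y1, x2, y2) box format and function names are assumptions.

```python
# Illustrative sketch: IoU between axis-aligned boxes and the IoU >= 0.5
# match rule used when reporting mAP@0.5. Boxes are (x1, y1, x2, y2) in
# pixels (assumed format).

def iou(box_a, box_b):
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    # Intersection rectangle (zero area if the boxes do not overlap)
    ix1, iy1 = max(ax1, bx1), max(ay1, by1)
    ix2, iy2 = min(ax2, bx2), min(ay2, by2)
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (ax2 - ax1) * (ay2 - ay1)
    area_b = (bx2 - bx1) * (by2 - by1)
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def is_true_positive(pred_box, gt_box, threshold=0.5):
    """A detection counts as correct at mAP@0.5 when its IoU with a
    ground-truth grape-bunch box reaches the 0.5 threshold."""
    return iou(pred_box, gt_box) >= threshold

# Example: a predicted bunch box overlapping a labelled one
print(round(iou((10, 10, 110, 210), (30, 20, 120, 220)), 3))   # ~0.667
print(is_true_positive((10, 10, 110, 210), (30, 20, 120, 220)))  # True
```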