Dimensionality reduction plays an important role in the performance of large-scale image retrieval applications. In this paper, we explore Principal Component Analysis (PCA) as a dimensionality reduction method. For this purpose, first, Scale Invariant Feature Transform (SIFT) features and Speeded Up Robust Features (SURF) are extracted as image features. Second, PCA is applied to reduce the dimensions of the SIFT and SURF feature descriptors. By comparing multiple sets of experimental results on different image databases, we conclude that PCA with an appropriately chosen reduced dimensionality can effectively reduce the computational cost of image features while maintaining high retrieval performance.
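A minimal sketch of this pipeline, assuming OpenCV's SIFT implementation and scikit-learn's PCA (SURF would require the opencv-contrib build); the function name, image list, and choice of 32 components are illustrative assumptions, not the paper's exact configuration:

```python
import cv2
import numpy as np
from sklearn.decomposition import PCA

def reduced_descriptors(image_paths, n_components=32):
    """Extract SIFT descriptors and project them onto n_components PCA axes."""
    sift = cv2.SIFT_create()  # SURF lives in cv2.xfeatures2d (opencv-contrib only)
    per_image = []
    for path in image_paths:
        img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
        _, desc = sift.detectAndCompute(img, None)   # desc: (keypoints, 128)
        if desc is not None:
            per_image.append(desc)
    stacked = np.vstack(per_image)                   # pool descriptors over the database
    pca = PCA(n_components=n_components).fit(stacked)
    return [pca.transform(d) for d in per_image], pca
```

The reduced descriptors (e.g. 128 dimensions down to 32) are then used for matching or indexing in place of the originals, which is where the computational saving comes from.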
We propose a dynamic replication strategy that simultaneously satisfies tenant availability and performance requirements while taking into account the tenant budget and the provider profit. The proposed strategy is based on a cost model that calculates the minimum number of replicas required to maintain high data availability. A replica creation is triggered only when this number of replicas is not reached or when the response time objective is not satisfied. In addition, creating a new replica must be profitable for the provider. Furthermore, data replication and query scheduling are coupled in order to place these replicas in a load-balanced manner while respecting the tenant budget. The experimental results show that the proposed strategy can significantly improve availability and performance while taking the tenant budget into account.
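A minimal sketch of such a trigger, assuming a standard independent-failure availability model (the smallest n with 1 - (1 - p)^n above the target); the function names, the SLO parameters, and the profitability test are illustrative assumptions, not the paper's exact cost model:

```python
import math

def min_replicas(node_availability: float, target_availability: float) -> int:
    """Smallest replica count n such that 1 - (1 - p)^n >= target."""
    return math.ceil(math.log(1 - target_availability) /
                     math.log(1 - node_availability))

def should_replicate(current_replicas: int, node_availability: float,
                     target_availability: float, response_time: float,
                     response_time_slo: float, replication_cost: float,
                     expected_revenue: float) -> bool:
    """Create a new replica only if an availability or latency objective is
    missed AND the operation remains profitable for the provider."""
    too_few  = current_replicas < min_replicas(node_availability, target_availability)
    too_slow = response_time > response_time_slo
    profitable = expected_revenue > replication_cost
    return (too_few or too_slow) and profitable
```

For example, with node availability 0.9 and a 0.999 availability target, min_replicas returns 3, so a tenant running with 2 replicas would trigger a replication as long as it is profitable.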
Because logistics is a process-oriented business, we propose in this paper a decision-support measurement system for assessing the costs associated with each logistics process. This system calculates the economic, environmental and social costs of each logistics process to support sustainable logistics. We formulate the problem and present simulations to test our system, as sketched below. This proposal gives the decision-maker knowledge of the economic, ecological and social costs before making a decision.
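A minimal sketch of such a per-process cost aggregation, assuming each dimension has already been monetised; the data structure, field names, and weighted-sum aggregation are illustrative assumptions rather than the paper's actual model:

```python
from dataclasses import dataclass

@dataclass
class ProcessCosts:
    economic: float       # e.g. transport, storage, handling costs
    environmental: float  # e.g. monetised CO2 emissions
    social: float         # e.g. accident or noise externalities

def total_sustainable_cost(processes: list[ProcessCosts],
                           weights: tuple[float, float, float] = (1.0, 1.0, 1.0)) -> float:
    """Weighted sum of the three cost dimensions over all logistics processes."""
    w_eco, w_env, w_soc = weights
    return sum(w_eco * p.economic + w_env * p.environmental + w_soc * p.social
               for p in processes)
```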
Cloud computing is a new paradigm for enterprise application development that can facilitate the execution of workflows in a business process management system. Workflow technology can manage business processes efficiently, satisfying the requirements of modern enterprises. Besides scheduling, fault tolerance is a very important issue in workflow management. In this paper, we analyse and compare existing checkpointing strategies, then propose a lightweight checkpointing strategy suited to the characteristics of cloud computing and workflows. The proposed strategy, Adaptive Time-based Coordinated Checkpointing (ATCCp), ensures strong consistency without any synchronization. ATCCp uses the concept of soft checkpointing to minimize storage time and uses the VIOLIN topology to improve checkpointing performance. According to the experimental results, our approach decreases the overhead and the SLA violations.
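A minimal sketch of the general idea of time-based coordination with soft checkpoints, where each process checkpoints when its own timer expires (so no synchronization messages are exchanged) and only occasionally flushes to stable storage; the class name, interval, flush policy, and storage path are illustrative assumptions, not the ATCCp algorithm itself:

```python
import pickle
import time

class TimeBasedCheckpointer:
    """Timer-driven checkpointing: soft checkpoints stay in memory and only
    every flush_every-th checkpoint is written to stable storage."""

    def __init__(self, interval_s: float = 60.0, flush_every: int = 5,
                 store_path: str = "checkpoint.bin"):
        self.interval_s = interval_s
        self.flush_every = flush_every
        self.store_path = store_path
        self.soft_checkpoints = []
        self.next_deadline = time.monotonic() + interval_s

    def maybe_checkpoint(self, state) -> None:
        if time.monotonic() < self.next_deadline:
            return
        self.soft_checkpoints.append(pickle.dumps(state))  # soft checkpoint in RAM
        if len(self.soft_checkpoints) >= self.flush_every:  # permanent checkpoint
            with open(self.store_path, "wb") as f:
                f.write(self.soft_checkpoints[-1])
            self.soft_checkpoints.clear()
        self.next_deadline = time.monotonic() + self.interval_s
```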