Abstract: This paper presents a critical evaluation of current resource allocation strategies and their applicability in the Cloud Computing environment, which is expected to gain a prominent profile in the Future Internet. The research focuses on network awareness and consistent optimization of resource allocation strategies, and identifies issues that need further investigation by the research community. A framework for resource allocation in Cloud Computing, based on tailored active measurements, is also proposed. The main conclusion is that network topology, traffic considerations, and changing optimality criteria, together with dynamic user requirements, will play a dominant role in determining future Internet application architectures and protocols, shaping, for example, resource allocation strategies in Cloud Computing.
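To make the measurement-driven direction concrete, below is a minimal sketch of network-aware allocation: each candidate node is probed actively, and a request is placed on the node with the best combined network/load score. The node names, the probe function, and the scoring weights are illustrative assumptions, not the paper's actual framework.

```python
# Minimal sketch of measurement-driven allocation: probe each candidate
# node, then place the request on the node with the best observed score.
# Node names, the probe, and the scoring weights are illustrative only.
import random

def probe_latency_ms(node: str) -> float:
    """Stand-in for a tailored active measurement (e.g., an RTT probe).
    Simulated here with a random delay per node."""
    return random.uniform(1.0, 50.0)

def allocate(nodes: dict) -> str:
    """Pick the node minimizing a weighted mix of measured network
    latency and reported CPU load (both normalized to [0, 1])."""
    scores = {}
    for node, cpu_load in nodes.items():
        latency = probe_latency_ms(node) / 50.0  # normalize by probe ceiling
        scores[node] = 0.6 * latency + 0.4 * cpu_load
    return min(scores, key=scores.get)

if __name__ == "__main__":
    candidates = {"node-a": 0.2, "node-b": 0.7, "node-c": 0.4}
    print("allocate request to:", allocate(candidates))
```

In a real deployment the probe would be an actual active measurement (ping, path capacity estimate) and the weights would track the changing optimality criteria the paper highlights.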
Internet traffic at the various tiers of service providers is essentially a superposition, or active mixture, of traffic from many sources. The statistical properties of this superposition, and the resulting phenomenon of scaling, are important for network performance (queuing), traffic engineering (routing), and network dimensioning (bandwidth provisioning). In this article, the authors study superposition and scaling jointly in a non-asymptotic framework to better understand the point-process nature of the cumulative input traffic arriving at telecommunication devices (e.g., switches, routers). The authors further assess the scaling dynamics of the structural components (packets, flows, and sessions) of the cumulative input process and their relation to the superposition of point processes. Classical and new results are discussed, along with their applicability in access and core networks. The authors propose that renewal-theory-based approximate point-process models, namely Pareto renewal process superposition and Weibull renewal process superposition, can reproduce the second-order scaling observed in traffic data of access and backbone core networks, respectively.
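As an illustration of the modeling approach described above, the following sketch superposes independent Pareto renewal processes and checks the second-order scaling of the aggregate count process with a variance-time estimate of the Hurst parameter. The parameters (tail index, number of sources, time scales) are illustrative choices, not values fitted to the authors' traffic data.

```python
# Sketch: superpose independent Pareto renewal processes and inspect
# second-order scaling of the aggregate count process via the
# variance-time method. Parameters are illustrative, not fitted.
import numpy as np

def pareto_renewal_arrivals(alpha, xm, horizon, rng):
    """Arrival times of one renewal process with Pareto(alpha, xm)
    inter-arrival times, truncated at the observation horizon."""
    times, t = [], 0.0
    while t < horizon:
        t += xm * (1.0 - rng.random()) ** (-1.0 / alpha)  # inverse-CDF sample
        times.append(t)
    return np.array(times[:-1])

def variance_time(counts, scales):
    """Variance of the block-mean of the count process at increasing
    block sizes m; a log-log slope of 2H - 2 indicates Hurst parameter H."""
    out = []
    for m in scales:
        blocks = counts[: len(counts) // m * m].reshape(-1, m).sum(axis=1)
        out.append(blocks.var() / m**2)  # variance of the per-block mean
    return np.array(out)

rng = np.random.default_rng(1)
horizon, n_sources = 10_000.0, 50
arrivals = np.concatenate(
    [pareto_renewal_arrivals(alpha=1.4, xm=1.0, horizon=horizon, rng=rng)
     for _ in range(n_sources)]
)  # superposition: merge the n_sources point processes
counts, _ = np.histogram(arrivals, bins=int(horizon))  # counts per unit time
scales = [1, 2, 4, 8, 16, 32, 64, 128]
vt = variance_time(counts, scales)
H = 1 + np.polyfit(np.log(scales), np.log(vt), 1)[0] / 2  # slope = 2H - 2
print(f"estimated Hurst parameter: {H:.2f}")  # H > 0.5 suggests LRD
```

With a heavy-tailed inter-arrival index (alpha < 2) the superposed counts exhibit the slowly decaying variance-time curve associated with second-order self-similarity; a Weibull renewal superposition would be tested the same way by swapping the inter-arrival sampler.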
Ransomware is an emerging category of malware that locks computer data using powerful cryptographic algorithms. Its global propagation is a serious threat to individuals and organizations, with the banking sector and financial institutions as prime targets. After such an attack, digital forensics helps estimate the severity of the attack and the data loss it caused. Traditional digital forensics investigations use static or behavioral analysis to detect malware on infected systems. However, these procedures are challenged by malware obfuscation techniques: a malicious process can stay inactive and undetected if only a single memory dump is analyzed. There is therefore a need to collect numerous memory dumps of an individual program to support comprehensive and accurate analysis. In this article, we develop a framework that acquires volatile memory at regular time intervals to analyze the behavior of individual processes in memory. Through memory forensics, salient features are extracted from the infected memory dumps; these features can then be used to classify malicious and benign processes through machine learning more efficiently than conventional techniques.
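A minimal sketch of the two stages described above follows: periodic acquisition of memory dumps, then machine-learning classification of processes from features extracted per dump. The acquisition command, the feature set, and the toy training rows are hypothetical placeholders; a real pipeline would rely on an actual imaging utility and a memory-forensics framework such as Volatility for feature extraction.

```python
# Sketch of the framework's two stages: (1) acquire volatile memory at
# fixed intervals, (2) classify processes from per-dump features.
# 'memdump-tool' and the feature names are hypothetical placeholders.
import subprocess
import time

def acquire_dumps(out_prefix, interval_s, count):
    """Take `count` memory snapshots, one every `interval_s` seconds.
    'memdump-tool' stands in for the actual acquisition utility."""
    paths = []
    for i in range(count):
        path = f"{out_prefix}_{i}.raw"
        subprocess.run(["memdump-tool", "--output", path], check=True)
        paths.append(path)
        time.sleep(interval_s)
    return paths

# Stage 2: train a classifier on per-process features extracted from the
# dumps (e.g., handle count, injected-thread count, entropy of regions).
from sklearn.ensemble import RandomForestClassifier

X_train = [[120, 0, 4.1], [980, 6, 7.8], [200, 1, 5.0]]  # toy feature rows
y_train = [0, 1, 0]                                       # 0 = benign, 1 = malicious
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print(clf.predict([[870, 5, 7.5]]))  # classify an unseen process
```

Comparing the same process across successive dumps is what lets the approach catch malware that stays dormant in any single snapshot.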