2020
DOI: 10.1007/s11227-020-03218-w

Stochastic performance model for web server capacity planning in fog computing

Cited by 25 publications (11 citation statements)
References 29 publications
“…A few researchers also investigated the impact of reactive autoscaling policies and configurations on performance [26], [27]. Others used fog computing techniques to develop a stochastic performance model for capacity planning based on a Markov chain model [28]. Still other researchers developed configuration recommender tools based on Support Vector Regression (SVR) and Genetic Algorithms (GA) [30].…”
Section: Related Work (mentioning)
confidence: 99%
“…Usually, the processes in the computing server are modeled with multiple servers, including homogeneous servers [24, 25] and heterogeneous servers [29, 39]. An admission manager or load balancer is often modeled as a single server placed ahead of the computing servers [25, 27]. For task/data arrival, most studies described the task/data flow by a Poisson distribution.…”
Section: Related Work (mentioning)
confidence: 99%
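These modeling choices (Poisson arrivals feeding a pool of homogeneous servers behind a load balancer) lead directly to textbook queueing formulas. As a rough illustration only, and not code from any of the cited papers, the Python sketch below evaluates an M/M/c queue with the Erlang C formula; the function name mmc_metrics and the example arrival/service rates are assumptions made here for the example.

```python
from math import factorial

def mmc_metrics(lam: float, mu: float, c: int):
    """Return (P_wait, Lq, Wq) for a stable M/M/c queue (lam < c*mu)."""
    rho = lam / (c * mu)                      # per-server utilisation
    if rho >= 1:
        raise ValueError("unstable queue: need lam < c*mu")
    a = lam / mu                              # offered load in Erlangs
    # Erlang C: probability that an arriving task must wait in the queue
    tail = a**c / (factorial(c) * (1 - rho))
    p_wait = tail / (sum(a**k / factorial(k) for k in range(c)) + tail)
    lq = p_wait * rho / (1 - rho)             # mean number of waiting tasks
    wq = lq / lam                             # mean waiting time (Little's law)
    return p_wait, lq, wq

# Example: 40 req/s offered to 3 homogeneous servers, each serving 20 req/s
print(mmc_metrics(lam=40.0, mu=20.0, c=3))
```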
“…For example, Kafhali and Salah [24] modeled the edge server with an M/M/s/K queueing model. Pereira et al. [25] proposed a queueing model for a fog node containing a load balancer and several web servers and solved it in closed form. Liu et al. [27] analyzed the user equipment (UE) side delay and the server-side delay separately, from the perspective of task offloading and resource allocation.…”
Section: Related Work (mentioning)
confidence: 99%
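For the finite-capacity variant referenced above, the sketch below (again an illustration under standard queueing assumptions, not the actual model of [24] or [25]) computes the steady-state distribution of an M/M/s/K queue and derives the blocking probability and mean response time of an edge node; the function name mmsk and the numeric parameters are hypothetical.

```python
from math import factorial

def mmsk(lam: float, mu: float, s: int, K: int):
    """Return (P_block, L, W) for an M/M/s/K queue with K >= s."""
    a = lam / mu                                  # offered load in Erlangs
    # Unnormalised steady-state probabilities p_0 .. p_K
    raw = [a**n / factorial(n) if n <= s
           else a**n / (factorial(s) * s**(n - s))
           for n in range(K + 1)]
    total = sum(raw)
    probs = [p / total for p in raw]
    p_block = probs[K]                            # arrivals rejected when the node is full
    L = sum(n * p for n, p in enumerate(probs))   # mean number of tasks in the node
    lam_eff = lam * (1 - p_block)                 # admitted (effective) arrival rate
    W = L / lam_eff                               # mean response time (Little's law)
    return p_block, L, W

# Example: edge node with 4 servers, room for K = 20 tasks,
# 70 req/s arriving, each server handling 20 req/s
print(mmsk(lam=70.0, mu=20.0, s=4, K=20))
```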