2022
DOI: 10.1002/nem.2213
RNN‐EdgeQL: An auto‐scaling and placement approach for SFC

Abstract: This paper proposes prediction‐based scaling and placement of service function chains (SFCs) to improve service level agreement (SLA) compliance and reduce operation cost. We used a variant of recurrent neural network (RNN) called the gated recurrent unit (GRU) for resource demand prediction. Then, considering these predictions, we built an intuitive scale in/out algorithm. We also developed an algorithm that applies Q‐Learning in an Edge computing environment (EdgeQL) to place these scaled‐out VNFs in appropriate loc…
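The "intuitive scale in/out algorithm" the abstract mentions is not detailed in this excerpt, but the idea of comparing predicted demand against utilisation thresholds can be sketched as follows. This is a minimal illustrative sketch, not the paper's algorithm: the function name, the thresholds, and the single-instance-per-step policy are all assumptions made for illustration.

```python
# Hypothetical sketch of a prediction-driven scale in/out decision.
# HIGH_UTIL / LOW_UTIL thresholds and the one-instance-at-a-time policy
# are illustrative assumptions, not values from the paper.

HIGH_UTIL = 0.8   # scale out when predicted utilisation exceeds this
LOW_UTIL = 0.3    # scale in when predicted utilisation falls below this

def scale_decision(predicted_load: float, instances: int,
                   capacity_per_instance: float) -> int:
    """Return the suggested VNF instance count for the next interval,
    given the predicted load (e.g., from a GRU forecaster)."""
    utilisation = predicted_load / (instances * capacity_per_instance)
    if utilisation > HIGH_UTIL:
        return instances + 1          # scale out one instance
    if utilisation < LOW_UTIL and instances > 1:
        return instances - 1          # scale in, but keep at least one
    return instances                  # predicted load fits current capacity
```

For example, with 2 instances of capacity 5.0 each, a predicted load of 9.0 (90% utilisation) would trigger a scale-out to 3 instances.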

Cited by 1 publication (2 citation statements); references 35 publications.
“…For example, two words e[0] = "banana" and e[1] = "grapes" have cosine_similarity([e[0]], [e[1]]) of 0.9911089. This makes sense, as they are both fruits; e[3] = "Jeff Bezos" and e[4] = "Elon Musk" have cosine_similarity([e[3]], [e[4]]) of 0.98720354, as they are both famous entrepreneurs; comparing "banana" with "Jeff Bezos" gives 0.84, which is not as close as the 0.99 or 0.98 from the previous examples.…”
Section: Implementation Details
Confidence: 95%
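The cosine similarity used in the citation statement above can be computed with a few lines of stdlib Python. This is a generic sketch; the toy vectors below are illustrative and are not the actual word embeddings from the citing work.

```python
import math

def cosine_similarity(u, v):
    """Cosine similarity between two equal-length embedding vectors:
    dot(u, v) / (|u| * |v|), ranging from -1 to 1."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Toy vectors (hypothetical embeddings): nearly parallel vectors score
# close to 1, orthogonal ones score 0.
print(cosine_similarity([1.0, 2.0], [1.1, 2.1]))  # close to 1
print(cosine_similarity([1.0, 0.0], [0.0, 1.0]))  # 0.0
```

In practice, libraries such as scikit-learn expose the same computation as `sklearn.metrics.pairwise.cosine_similarity`, which explains the list-of-lists call style (`[e[0]], [e[1]]`) in the quoted statement.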
“…Several text classification models have proven effective, for example Recurrent Neural Networks (RNNs) and Convolutional Neural Networks (CNNs). Although they have been shown to work well in many tasks [2][3][4][5], certain problems exist [1]. For instance, long sentences are hard for these models to learn well due to the vanishing-gradient problem.…”
Section: Introduction
Confidence: 99%
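The vanishing-gradient problem mentioned in the statement above has a simple numerical intuition: in a plain RNN, the backpropagated gradient is rescaled at every time step, so when the effective recurrent factor is below 1 the gradient shrinks geometrically with sequence length. The sketch below is a toy illustration of that effect, not an actual backpropagation implementation.

```python
# Toy illustration of gradient vanishing: repeatedly scaling a gradient
# signal by a factor < 1 (the effective recurrent Jacobian norm) makes
# it decay geometrically with the number of time steps.
def gradient_magnitude(factor: float, steps: int) -> float:
    g = 1.0
    for _ in range(steps):
        g *= factor
    return g

# With factor 0.9, the signal surviving 50 steps back is under 1% of the
# original, which is why long sentences are hard for plain RNNs and why
# gated units such as the GRU were introduced.
```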