2021
DOI: 10.1002/spe.3016

Container lifecycle‐aware scheduling for serverless computing

Abstract: Elastic scaling in response to changes in demand is a main benefit of serverless computing. When bursty workloads arrive, a serverless platform launches many new containers and initializes function environments (known as cold starts), which incurs significant startup latency. To reduce cold starts, platforms usually pause a container after it serves a request, and reuse this container for subsequent requests. However, this reuse strategy cannot efficiently reduce cold starts because the schedulers are agnostic…
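As a rough illustration of the pause-and-reuse strategy described in the abstract, the sketch below (hypothetical names and latency numbers, not the authors' implementation) models a platform that reuses a paused container for a function when one exists and otherwise pays a cold-start penalty:

    from collections import deque
    import time

    COLD_START_MS = 500   # assumed cost of creating and initializing a new container
    WARM_START_MS = 5     # assumed cost of unpausing a paused container

    class WarmPool:
        """Toy model of pause-and-reuse: paused containers are kept per function
        and handed back out for subsequent requests."""

        def __init__(self):
            self.paused = {}  # function name -> deque of paused container ids

        def schedule(self, func):
            pool = self.paused.setdefault(func, deque())
            if pool:
                return pool.popleft(), WARM_START_MS      # warm start: reuse a paused container
            new_id = f"{func}-{time.monotonic_ns()}"      # cold start: launch a new container
            return new_id, COLD_START_MS

        def release(self, func, container):
            # Pause the container after it serves a request so it can be reused.
            self.paused.setdefault(func, deque()).append(container)

    pool = WarmPool()
    c, latency = pool.schedule("resize-image")   # first request: cold start (500 ms)
    pool.release("resize-image", c)
    c, latency = pool.schedule("resize-image")   # second request: warm start (5 ms)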

Cited by 22 publications (3 citation statements)
References 14 publications
“…Containerization: The most common way to package and isolate the function is through containerization. 52 In this technology, a function is encapsulated within a widely-accepted container standard, termed Open Container Initiative (OCI) format, 53 which is supported in all modern containerization solutions. Any programming language and/or software dependency can be supported, thus, the desired generality of serverless is accomplished.…”
Section: Function Isolation in Serverless Computing (mentioning)
confidence: 99%
“…The authors also propose a two-level resource management mechanism, where the higher level entirely leverages the Kubernetes scheduler, whereas the lower level is specific to function dispatching. [115] presents the concept of 'lifecycle-aware scheduling', where the main idea is to prevent the eviction of containers that may be needed soon and to favor existing containers that will become available sooner than containers that have to be created anew. While this contribution is not directly based on Kubernetes, it leverages the open source serverless platform OpenWhisk [7], which accepts Kubernetes as one of its deployment options.…”
Section: Summary and Identified Gaps (mentioning)
confidence: 99%
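A minimal sketch of the lifecycle-aware idea summarized in the statement above, under assumed names and an illustrative cost model (none of which come from the paper): dispatch prefers an existing container that will become free sooner than a new container could be cold-started, and eviction spares containers whose function is expected to be invoked again soon.

    COLD_START_S = 0.5  # assumed cold-start cost; illustrative only

    class Container:
        def __init__(self, func, busy_until=0.0):
            self.func = func
            self.busy_until = busy_until  # time at which its current request finishes

    def pick_container(containers, func, now):
        """Lifecycle-aware dispatch: favor an existing container that becomes
        available sooner than a new container could be cold-started."""
        candidates = [c for c in containers if c.func == func]
        if not candidates:
            return None  # no container for this function: caller must cold-start
        best = min(candidates, key=lambda c: max(c.busy_until - now, 0.0))
        return best if max(best.busy_until - now, 0.0) < COLD_START_S else None

    def pick_eviction_victim(containers, next_invocation_eta, now):
        """Lifecycle-aware eviction: keep containers whose function is expected
        to be invoked again soon; evict the one needed furthest in the future."""
        def time_until_needed(c):
            return next_invocation_eta.get(c.func, float("inf")) - now
        return max(containers, key=time_until_needed)

    # Example: reusing a soon-to-be-free container beats a 0.5 s cold start,
    # and the container whose function is needed soonest is kept.
    containers = [Container("resize", busy_until=0.2), Container("encode", busy_until=3.0)]
    assert pick_container(containers, "resize", now=0.0).busy_until == 0.2
    assert pick_eviction_victim(containers, {"resize": 0.3}, now=0.0).func == "encode"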
“…In the context of virtualization, Wu et al 45 proposed a container lifecycle‐aware scheduling algorithm for serverless computing, where each computing request is processed in a container with a specified resource requirement in terms of CPU and memory. Other mechanisms have been proposed by Fan et al 46 in the context of cloud computing.…”
Section: Related Work (mentioning)
confidence: 99%