2021 IEEE International Conference on Cluster Computing (CLUSTER) 2021
DOI: 10.1109/cluster48925.2021.00096
The Case for Storage Optimization Decoupling in Deep Learning Frameworks

Cited by 6 publications (4 citation statements). References 32 publications.
“…Results of each experiment concern the average and standard deviation of 7 runs. For ResNet-50, all setups perform similarly, ranging from 64 to 67 minutes of execution time, as it imposes less I/O demand [15]. Interestingly, the Lustre setup exhibits the highest training time variability across identical runs of each experiment.…”
Section: A DL Training Under Different Storage Setups
Mentioning confidence: 96%
“…CoorDL [17] provides insights on storage I/O data stalls and mitigates them by providing a new in-memory caching policy. PRISMA [15] proposes a Software-Defined Storage data plane that performs data prefetching to memory.…”
Section: Related Work
Mentioning confidence: 99%
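The prefetching idea attributed to PRISMA above can be illustrated with a minimal sketch (this is an assumption-laden stand-in, not the paper's actual data plane): a background thread reads samples from storage into a bounded in-memory buffer, so I/O latency is hidden behind the consumer's computation. The names `make_prefetcher` and `load_fn` are hypothetical.

```python
import queue
import threading

def make_prefetcher(load_fn, keys, capacity=4):
    """Load items ahead of the consumer into a bounded in-memory queue,
    overlapping storage I/O with downstream computation."""
    buf = queue.Queue(maxsize=capacity)
    _END = object()  # sentinel marking end of the key stream

    def worker():
        for k in keys:
            buf.put(load_fn(k))  # blocks when the buffer is full (backpressure)
        buf.put(_END)

    threading.Thread(target=worker, daemon=True).start()

    def consume():
        while True:
            item = buf.get()
            if item is _END:
                return
            yield item

    return consume()

# Usage: load_fn stands in for a storage read (here a trivial transform).
samples = list(make_prefetcher(lambda k: k * 2, range(5)))
```

The bounded queue gives backpressure: the prefetch thread stalls when the consumer falls behind, keeping memory use fixed at `capacity` items.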
See 1 more Smart Citation
“…For the purpose of this work, we assume the preprocessing is performed on the CPU, while the training is performed on the GPUs. This is a common scenario [2,19,21,25,26,42] that makes efficient use of heterogeneous compute resources.…”
Section: Introduction
Mentioning confidence: 99%
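The CPU-preprocess/GPU-train split described above can be sketched in pure Python (a hedged illustration only: `preprocess` and `train_step` are hypothetical stand-ins for augmentation and a GPU pass, and a thread pool substitutes for a real framework's data-loading workers):

```python
from concurrent.futures import ThreadPoolExecutor

def preprocess(sample):
    """Stand-in for CPU-bound decoding/augmentation."""
    return sample + 1

def train_step(batch):
    """Stand-in for a GPU forward/backward pass; returns a mock loss."""
    return sum(batch)

def run_epoch(samples, batch_size=2, workers=2):
    """Preprocess on CPU workers while train_step consumes finished batches,
    mirroring the heterogeneous CPU/GPU pipeline described above."""
    losses = []
    with ThreadPoolExecutor(max_workers=workers) as pool:
        prepped = pool.map(preprocess, samples)  # runs ahead of the consumer
        batch = []
        for x in prepped:
            batch.append(x)
            if len(batch) == batch_size:
                losses.append(train_step(batch))
                batch = []
        if batch:  # flush the final partial batch
            losses.append(train_step(batch))
    return losses

losses = run_epoch(range(4))  # samples 0..3 preprocess to 1..4
```

Real frameworks realize the same overlap with dedicated loader processes (e.g. multi-worker data loaders) rather than a thread pool, but the structure, producers preparing batches concurrently with a consumer training loop, is the same.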