2014
DOI: 10.14529/jsfi140106

Exascale Storage Systems - An Analytical Study of Expenses

Abstract: The computational power and storage capability of supercomputers are growing at a different pace, with storage lagging behind; the widening gap necessitates new approaches to keep the investment and running costs for storage systems at bay. In this paper, we aim to unify previous models and compare different approaches for solving these problems. By extrapolating the characteristics of the German Climate Computing Center's previous supercomputers to the future, cost factors are identified and quantified in or…

Cited by 9 publications (6 citation statements); references 14 publications.
“…DeltaFS [188] is a file system that improves scalability by letting compute nodes manage metadata instead of a centralized server (as is common in traditional distributed file systems). Kunkel et al. [105] review three methods to improve I/O performance at exascale: re-computation (found to be beneficial for infrequent accesses),…”
Section: I/O Middleware
Mentioning confidence: 99%
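To make the re-computation trade-off concrete, here is a minimal break-even sketch. Every price, size, and runtime below is a hypothetical placeholder, not a figure from the paper or from Kunkel et al.; the sketch only illustrates why re-computation can win when data are accessed rarely.

```python
# Hypothetical break-even sketch: keep a dataset on disk vs. re-compute it
# on demand. All cost figures below are made-up placeholders.

def storage_cost(size_tb, years, eur_per_tb_year=20.0):
    """Cost of keeping `size_tb` online for `years` at a hypothetical rate."""
    return size_tb * eur_per_tb_year * years

def recompute_cost(accesses, node_hours, eur_per_node_hour=0.5):
    """Cost of re-running the producing job once per access (hypothetical)."""
    return accesses * node_hours * eur_per_node_hour

size_tb, years, node_hours = 100.0, 5, 2000.0
for accesses in (0, 1, 5, 20):
    store = storage_cost(size_tb, years)
    redo = recompute_cost(accesses, node_hours)
    print(f"{accesses:2d} accesses: store {store:7.0f} EUR vs "
          f"re-compute {redo:7.0f} EUR -> "
          f"{'re-compute' if redo < store else 'store'}")
```

With these placeholder numbers, re-computation is cheaper up to a handful of accesses and storage wins once the data are read often, which is the qualitative point the citation makes.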
“…While lossless methods are often viewed as "safer" for scientific data, it is well known that lossless data compression of floating-point simulation data is difficult and often yields little benefit (e.g., Lindstrom and Isenburg, 2006; Bicer et al., 2013; Lakshminarasimhan et al., 2011). The reason for the relative ineffectiveness of lossless methods on scientific data (in contrast to image or audio data, for example) is that trailing digits of the fixed-precision floating-point output data are often essentially random, depending on the data type and the number of physically significant digits.…”
Section: Data Compression
Mentioning confidence: 99%
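The effect described above is easy to reproduce. The following sketch (array size and noise level are illustrative assumptions, not values from any cited study) compresses a smooth field whose trailing mantissa bits are noisy, mimicking typical floating-point simulation output:

```python
import zlib
import numpy as np

# Smooth signal plus tiny noise, so the low-order mantissa bits of each
# float64 value are essentially random -- as in simulation output.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 8.0 * np.pi, 1_000_000)
field = np.sin(x) + 1e-9 * rng.standard_normal(x.size)

raw = field.tobytes()
ratio = len(raw) / len(zlib.compress(raw, 9))
print(f"lossless (zlib) ratio on float64 output: {ratio:.2f}x")
# Typically only slightly above 1x: the random trailing bits defeat
# dictionary- and entropy-based lossless coders.
```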
“…Random numbers are a liability for compression, thus giving lossy methods a significant advantage. Many recent efforts have focused on effectively applying or adapting lossy techniques for scientific datasets (e.g., Lakshminarasimhan et al., 2011; Iverson et al., 2012; Laney et al., 2013; Gomez and Cappello, 2013; Lindstrom, 2014). In the climate modeling community in particular, lossy data compression has been the subject of a number of recent studies (e.g., Woodring et al., 2011; Hübbe et al., 2013; Bicer et al., 2013; Baker et al., 2014; Kuhn et al., 2016; Silver and Zender, 2016; Zender, 2016), though we are not aware of comparable efforts on evaluating the effects on the scientific validity of the climate data and results.…”
Section: Data Compression
Mentioning confidence: 99%
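A minimal "bit shaving" sketch in the spirit of the cited lossy approaches shows why discarding the random trailing bits helps; the bit counts are illustrative assumptions and this is not the algorithm of any particular cited paper:

```python
import zlib
import numpy as np

def truncate_mantissa(a, keep_bits):
    """Lossy step: zero all but `keep_bits` of the float64 mantissa (52 bits)."""
    mask = np.uint64(0xFFFFFFFFFFFFFFFF) << np.uint64(52 - keep_bits)
    return (a.view(np.uint64) & mask).view(np.float64)

rng = np.random.default_rng(0)
field = (np.sin(np.linspace(0.0, 25.0, 1_000_000))
         + 1e-9 * rng.standard_normal(1_000_000))

for keep in (52, 20, 10):  # 52 = lossless baseline
    buf = truncate_mantissa(field, keep).tobytes()
    ratio = len(buf) / len(zlib.compress(buf, 9))
    print(f"{keep:2d} mantissa bits kept -> zlib ratio {ratio:.2f}x")
```

Keeping fewer mantissa bits sacrifices physically insignificant digits but makes the byte stream far more regular, so the subsequent lossless coder achieves a much better ratio.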
“…However, the operator splitting approach of MESSy proves more powerful, also allowing for coupling of different domains, as demonstrated, e.g., by the integration of an ocean subsystem (Pozzer et al., 2011). An extension by Kerkweg and Jöckel (2012b) allowed for one-way coupling of different spatially nested domains using a server-client approach with point-to-point communication.…”
Section: MESSy
Mentioning confidence: 99%
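For readers unfamiliar with the operator-splitting idea the citation refers to, here is a toy first-order (Lie) splitting sketch; MESSy's actual coupling infrastructure is far more involved, and the right-hand-side terms below are purely illustrative assumptions:

```python
import numpy as np

# First-order (Lie) operator splitting for du/dt = A(u) + B(u):
# advance with A alone, then with B alone, each over the full step dt.

def step_A(u, dt, k=1.0):
    """Substep for a linear decay term du/dt = -k*u (exact update)."""
    return u * np.exp(-k * dt)

def step_B(u, dt, s=0.5):
    """Substep for a constant source term du/dt = s."""
    return u + s * dt

def lie_split(u, dt, n_steps):
    for _ in range(n_steps):
        u = step_B(step_A(u, dt), dt)
    return u

u = lie_split(np.array([1.0]), dt=0.01, n_steps=1000)
print(u)  # approximates u(10) of du/dt = -u + 0.5, whose exact limit is 0.5
```

Each process (here: decay and source) is integrated independently per step, which is what lets a framework like MESSy plug in and couple separately developed submodels.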