2018
DOI: 10.20944/preprints201810.0115.v2
Preprint

Science Pipelines for the Square Kilometre Array

Abstract: The Square Kilometre Array (SKA) will be both the largest radio telescope ever constructed and the largest Big Data project in the known Universe. The first phase of the project will generate on the order of 5 zettabytes of data per year. A critical task for the SKA will be its ability to process data for science, which will need to be conducted by science pipelines. Together with polarization data from the LOFAR Multifrequency Snapshot Sky Survey (MSSS), we have been developing a realistic SKA-like science pi…
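For a rough sense of the scale involved (this is not a figure from the preprint, just a back-of-the-envelope conversion of the quoted ~5 zettabytes per year), the annual volume corresponds to a sustained average data rate of well over 100 TB/s:

```python
# Back-of-the-envelope conversion of the quoted ~5 ZB/year into a
# sustained average data rate. Illustrative only: the 5 ZB/year figure
# is the one quoted in the abstract; everything else is plain arithmetic.

ZETTABYTE = 10**21                       # bytes
SECONDS_PER_YEAR = 365.25 * 24 * 3600    # ~3.16e7 seconds

annual_volume = 5 * ZETTABYTE            # ~5 ZB/year (SKA phase 1, as quoted)
average_rate = annual_volume / SECONDS_PER_YEAR

print(f"Average sustained rate: {average_rate / 10**12:.0f} TB/s")
# -> ~158 TB/s, averaged over a full year
```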


Cited by 10 publications (5 citation statements)
References 34 publications
“…Developing scalable pipelines and workflows for HPC tasks involving large datasets has also been well-studied in the literature (Farnes et al., 2018; Hendrix et al., 2016; Paraskevakos et al., 2019; Lyons et al., 2019). For example, the authors of Farnes et al. (2018) presented a technique for building scalable workflows for analyzing large volumes of satellite imagery data, while Lyons et al. (2019) presented a system for analyzing workflows related to weather-sensing data. Other studies have presented generalized methodologies for building scalable workflows for tasks requiring HPC platforms (Hendrix et al., 2016; Castellana et al., 2019).…”
Section: Performance Attributes (mentioning)
confidence: 99%
“…Developing scalable pipelines and workflows for HPC tasks involving large datasets has also been well-studied in the literature [14], [21], [25], [29]. For example, the authors of [14] present a technique for building scalable workflows for analyzing large volumes of satellite imagery data, while the authors of [25] present a system for analyzing workflows related to weather-sensing data. Other studies have presented generalized methodologies for building scalable workflows for tasks requiring HPC platforms [5], [21].…”
Section: Related Work (mentioning)
confidence: 99%
“…Wide Field Survey Explorer (WISE, Wright et al. 2010), which performed a photometric all-sky survey in near- and mid-infrared passbands delivered over 23 TB (excluding multi-epoch and reject catalogues). Soon, facilities that are currently under development, such as the Square Kilometre Array (SKA; Dewdney et al. 2009) and the Large Synoptic Survey Telescope (LSST; Ivezic et al. 2008) will provide even larger volumes of data: the LSST is expected to deliver in total 60 PB of raw images, while the SKA would give over 5 ZB (Farnes et al., 2018). Such an exponential growth in the quantity of data has compelled astronomers to develop automatic tools for extracting knowledge about known objects as well as to discover new information.…”
Section: Introduction (mentioning)
confidence: 99%