2020
DOI: 10.1088/1748-0221/15/07/c07038

New challenges for distributed computing at the CMS experiment

Abstract: The Large Hadron Collider (LHC) experiments will soon step into the next period of Run 3 data-taking, with an increased data rate and high pileup requiring a well-functioning computing infrastructure. For the future High-Luminosity LHC (HL-LHC) data-taking period, the compute, storage, and network facilities will have to be extended by large factors, and flexible, sophisticated computing models will be essential. New state-of-the-art techniques in physics analysis and data science, Deep Learning a…

Cited by 2 publications (3 citation statements)
References 6 publications
“…The challenges to face were (and still are): to keep operating the system reliably enough with significantly less effort, to evolve towards new, more open and flexible computing models, and to provide the software and the resources needed by the LHC experiments. In terms of data processing, the rate of advances in hardware performance has slowed in recent years, encouraging the community to adapt and to take advantage of developments such as graphics processing units (GPUs), high-performance computing (HPC) and commercial Cloud services [44]. Cloud computing is a technology largely viewed as the next big step in the development and deployment of an increasing number of distributed applications.…”
Section: Future Challenges: Generic Detector R&D, Computing and Soft...
mentioning
confidence: 99%
“…The advent of computing and information technology, especially the widespread adoption of internet technology, digital platforms, mobile devices and cloud computing, has fueled explosive growth in data generation across many different fields. This growth in data production manifests itself in many different data types, characterized by extensive noise and high complexity [1].…”
mentioning
confidence: 99%
“…In addition, a provincial police office accumulated 20 billion road-vehicle monitoring records over three years, a total of 120 TB. Forecasts from IDC, a leading IT market research and consulting firm, predict that the global annual volume of data will reach 35 ZB by 2020 [1]. The term "big data" has emerged to summarize the nature of such large, unstructured, digitized data sets.…”
mentioning
confidence: 99%
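
A quick back-of-the-envelope check of the volumes quoted in the last statement: 120 TB spread over 20 billion records implies an average record size of roughly 6 KB. The short Python sketch below only re-derives that figure from the numbers given above; the per-record size itself is an inference, not a value stated by the citing paper.

    # Sanity check of the quoted data volumes (per-record size is derived, not from the source)
    total_bytes = 120 * 10**12   # 120 TB of road-vehicle monitoring data
    num_records = 20 * 10**9     # 20 billion records over three years

    avg_record_bytes = total_bytes / num_records
    print(f"average record size = {avg_record_bytes / 1e3:.0f} KB")  # prints: average record size = 6 KB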