2019
DOI: 10.1051/epjconf/201921402003
The Data Quality Monitoring Software for the CMS experiment at the LHC: past, present and future

Abstract: The Data Quality Monitoring software is a central tool in the CMS experiment. It is used in the following key environments: (i) Online, for real-time detector monitoring; (ii) Offline, for the prompt-offline-feedback and final fine-grained data quality analysis and certification; (iii) Validation of all the reconstruction software production releases; (iv) Validation in Monte Carlo productions. Though the basic structure of the Run1 DQM system remains the same for Run2, between the Run1 and Run2 periods, the D…

Cited by 12 publications (15 citation statements)
References 9 publications
“…To maintain a high volume of good-quality data for physics analyses, the CMS Collaboration runs DQM [5] and DC workflows. Together they cover live DQM during data-taking (also called online operations) and data certification after the data has been recorded by the detector and fully reconstructed (also called offline operations). During collisions, the CMS detector sends a small portion of the data stream to the online DQM backend.…”
Section: Current Workflow of Data Quality Monitoring and Data Certifi… (mentioning)
confidence: 99%
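The statement above describes the online DQM backend receiving only a small sampled portion of the full event stream. A minimal sketch of that idea follows; the function name `sample_stream` and the 5% default fraction are illustrative assumptions, not the actual CMS implementation or its configuration.

```python
import random

def sample_stream(events, fraction=0.05, rng=None):
    """Forward a small, randomly sampled fraction of the full event
    stream to a (toy) monitoring backend.

    Illustrative sketch only: the real online DQM transport and the
    sampling fraction used by CMS are not specified in the text.
    """
    rng = rng or random.Random()
    return [e for e in events if rng.random() < fraction]

# With a fixed seed the sample size is reproducible and close to 5%.
rng = random.Random(42)
sampled = sample_stream(range(10_000), fraction=0.05, rng=rng)
print(len(sampled))  # roughly 500 of 10,000 events
```

Sampling (rather than shipping every event) keeps the monitoring path lightweight enough to run in real time alongside data-taking.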
“…Nonetheless, without a sustained two-year effort, a new ROOT histogram library cannot be advanced to the minimal feature level required for adoption. The community risks continuing to work around the ownership discrepancies of ROOT's histograms, paying a performance price in analyses, and further fragmenting the experiments' online and offline frameworks, where even today experiment-specific histogramming facilities have been reintroduced to avoid ROOT's histogram library [25]. CMS has provided a review of the issues with ROOT histograms encountered in DQM.…”
Section: Histogramming (mentioning)
confidence: 99%
“…In recent years, several studies have demonstrated the benefit of using Deep Learning (DL) to solve typical tasks related to data taking and analysis. Building on these examples, many HEP experiments are now working on integrating DL into their workflows for many different applications: from data quality assurance [38], to real-time selection of interesting collision events [39], to simulation [40] and data analysis [41]. For example, generative models, from GANs to Variational Auto-Encoders (VAEs), are being tested as fast alternatives to Monte Carlo based simulation.…”
Section: Related Work, A: Machine Learning in Scientific Applications (mentioning)
confidence: 99%