2019
DOI: 10.48550/arxiv.1912.12900
Preprint

News from bottomonium spectral functions in thermal QCD

Abstract: New results on bottomonium at nonzero temperature are presented, using the FASTSUM Generation 2L ensembles. Preliminary results for spectral function reconstruction using Kernel Ridge Regression, a machine learning technique, are shown as well and compared to results from the Maximum Entropy Method.

Cited by 5 publications
(12 citation statements)
References 12 publications
“…Other mock functions can be constructed; in Ref. [8] the logarithm of the spectral function was expanded in an orthonormal set of incomplete basis functions. Continuing with the Gaussian peaks here, we generate a collection of peaks by sampling values for {Z_p, m_p, Γ_p} from a set of distributions.…”
Section: Data Generation
confidence: 99%
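The data-generation step quoted above can be sketched as follows: mock spectral functions are built as superpositions of Gaussian peaks, with amplitude Z_p, position m_p and width Γ_p drawn from chosen distributions. The uniform distributions and parameter ranges below are illustrative assumptions, not the ones used in the paper:

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def sample_peaks(n_peaks, rng):
    """Draw (Z_p, m_p, Gamma_p) for each peak.
    Ranges are illustrative placeholders, not the paper's choices."""
    Z = rng.uniform(0.1, 1.0, n_peaks)       # amplitudes Z_p
    m = rng.uniform(9.0, 11.0, n_peaks)      # positions m_p (bottomonium-like window, in GeV)
    Gamma = rng.uniform(0.05, 0.5, n_peaks)  # widths Gamma_p
    return Z, m, Gamma

def mock_spectral_function(omega, Z, m, Gamma):
    """rho(omega) = sum_p Z_p exp(-(omega - m_p)^2 / (2 Gamma_p^2))."""
    return sum(Zp * np.exp(-(omega - mp) ** 2 / (2 * Gp ** 2))
               for Zp, mp, Gp in zip(Z, m, Gamma))

# one mock spectral function on a frequency grid
omega = np.linspace(8.0, 12.0, 400)
Z, m, Gamma = sample_peaks(3, rng)
rho = mock_spectral_function(omega, Z, m, Gamma)
```

Repeating the sampling many times yields a training set of (correlator, spectral function) pairs once each mock ρ is pushed through the lattice kernel.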
“…In Ref. [8] Kernel Ridge Regression (KRR) was introduced to reconstruct bottomonium spectral functions from Euclidean lattice correlators. KRR is a machine learning method that requires training data and here we revisit intricacies of generating this data.…”
Section: Introduction
confidence: 99%
“…As is common in machine learning paradigms, kernel ridge regression (KRR) infers a prediction of a quantity based on previously observed training data. In the case of solving the […] Details of the KRR approach in the context of NRQCD can be found in [15]. Here we note that we modified our method for how the input data, the set of correlators G_i(τ) (where i = 1, 2, …”
Section: Kernel Ridge Regression
confidence: 99%
“…, N_train), is combined into the matrix (or kernel) C before the regression procedure. In contrast to [15], we use here the following encoding for the matrix elements of C, …”
Section: Kernel Ridge Regression
confidence: 99%
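The regression step described in these statements can be sketched minimally: build a kernel matrix C over the training correlators, solve the ridge-regularised system (C + λI)α = ρ_train, and predict a new spectral function as a kernel-weighted combination of the training targets. The Gaussian kernel, λ, and σ below are illustrative assumptions; the actual encoding of the matrix elements of C used in [15] and in the follow-up quoted above is not reproduced here:

```python
import numpy as np

def gaussian_kernel(G1, G2, sigma=1.0):
    """Illustrative Gaussian kernel between two correlator vectors."""
    d2 = np.sum((G1 - G2) ** 2)
    return np.exp(-d2 / (2 * sigma ** 2))

def krr_train(G_train, rho_train, lam=1e-6, sigma=1.0):
    """Solve (C + lam*I) alpha = rho_train for the regression weights."""
    n = len(G_train)
    C = np.array([[gaussian_kernel(G_train[i], G_train[j], sigma)
                   for j in range(n)] for i in range(n)])
    return np.linalg.solve(C + lam * np.eye(n), rho_train)

def krr_predict(G_new, G_train, alpha, sigma=1.0):
    """Prediction = kernel-weighted combination of training targets."""
    k = np.array([gaussian_kernel(G_new, Gi, sigma) for Gi in G_train])
    return k @ alpha

# toy usage: 3 training correlators (2 time slices each), 2-bin "spectral functions"
G_train = [np.array([1.0, 0.5]), np.array([0.2, 0.1]), np.array([2.0, 1.5])]
rho_train = np.array([[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]])
alpha = krr_train(G_train, rho_train)
pred = krr_predict(G_train[0], G_train, alpha)  # close to rho_train[0] for small lam
```

Because ρ_train is passed as a matrix whose rows are discretised spectral functions, the same solve returns one weight vector per frequency bin, so a full spectral function is predicted at once.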