2021
DOI: 10.48550/arxiv.2105.14782
Preprint
Auto-Differentiable Spectrum Model for High-Dispersion Characterization of Exoplanets and Brown Dwarfs

Hajime Kawahara,
Yui Kawashima,
Kento Masuda
et al.

Abstract: We present an auto-differentiable spectral model of exoplanets and brown dwarfs. This model enables fully Bayesian inference from high-dispersion data, fitting an ab initio line-by-line spectral computation to the observed spectrum by combining it with Hamiltonian Monte Carlo in recent probabilistic programming languages. An open-source code, exojax, developed in this study, was written in Python using the GPU/TPU-compatible package for automatic differentiation and accelerated linear algebra, JAX (…)

Cited by 2 publications (4 citation statements)
References 84 publications
“…Fortunately, our model is computationally efficient to evaluate and, given its analyticity, trivial to differentiate using an autodifferentiation scheme (see §11). It can therefore be used out-of-the-box within gradient-based sampling schemes such as pymc3 (Salvatier et al 2016), in particular the No-U-Turn Sampler (NUTS), a variant of Hamiltonian Monte Carlo that harnesses gradient information to greatly speed up the convergence of the MCMC chains (for a recent application of HMC in the context of spectroscopy, see Kawahara et al 2021). Our model may also be useful in the context of variational inference (VI), and in particular automatic differentiation variational inference (ADVI; Kucukelbir et al 2016), which casts posterior inference as an optimization problem; both VI and ADVI are also implemented in pymc3.…”
Section: Posterior Sampling
confidence: 99%
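The citation above turns on one idea: once a spectral model is differentiable, Hamiltonian Monte Carlo can use its gradient to explore the posterior efficiently. A minimal sketch of plain HMC (not the full NUTS variant) for a 1-D standard-normal target illustrates the mechanism; in exojax- or pymc3-style pipelines the gradient would come from automatic differentiation, whereas here it is supplied analytically, and all names are illustrative.

```python
import math
import random

def hmc_sample(log_grad, n_samples=2000, step=0.1, n_leap=20, seed=0):
    """Minimal Hamiltonian Monte Carlo for a 1-D target.

    `log_grad(q)` returns (log_prob, d log_prob / dq). In practice the
    gradient is produced by auto-differentiation; here it is analytic.
    """
    rng = random.Random(seed)
    q = 0.0
    lp, g = log_grad(q)
    samples = []
    for _ in range(n_samples):
        p0 = rng.gauss(0.0, 1.0)          # resample momentum
        q_new, p, g_new = q, p0, g
        # Leapfrog integration of the Hamiltonian dynamics
        p = p + 0.5 * step * g_new
        for i in range(n_leap):
            q_new = q_new + step * p
            lp_new, g_new = log_grad(q_new)
            if i != n_leap - 1:
                p = p + step * g_new
        p = p + 0.5 * step * g_new
        # Metropolis correction on the total energy
        h0 = -lp + 0.5 * p0 * p0
        h1 = -lp_new + 0.5 * p * p
        if math.log(rng.random() + 1e-300) < h0 - h1:
            q, lp, g = q_new, lp_new, g_new
        samples.append(q)
    return samples

# Standard normal target: log p(q) = -q^2/2 (up to a constant)
logpdf_grad = lambda q: (-0.5 * q * q, -q)
draws = hmc_sample(logpdf_grad)
mean = sum(draws) / len(draws)
```

NUTS adds automatic tuning of the trajectory length on top of this leapfrog/accept loop; the gradient `g` is exactly the quantity that auto-differentiable models provide for free.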
“…Modern deep learning libraries such as Tensorflow (Abadi et al 2015a), PyTorch (Paszke et al 2019) and JAX (Bradbury et al 2018) provide easy access to model training and evaluation. Our implementation will be based solely on the Tensorflow framework, but other deep learning frameworks can also be used (Kawahara et al 2021).…”
Section: Variational Inference
confidence: 99%
“…There have been several recent attempts within the field of exoplanets to develop differentiable physical models within these frameworks. Initial results show that such differentiable models may hold the key to overcoming the curse of dimensionality brought on by our increasingly complex models (Kawahara et al 2021) and may one day enable population studies without placing significant demand on computational resources.…”
Section: A Word On Differentiable Framework
confidence: 99%
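The differentiable frameworks named above (TensorFlow, PyTorch, JAX) all rest on automatic differentiation: arithmetic is overloaded so that derivatives propagate alongside values. A minimal forward-mode sketch using dual numbers, with all class and function names hypothetical, shows the core mechanism in a few lines:

```python
class Dual:
    """Forward-mode automatic differentiation via dual numbers.

    Each Dual carries a value and its derivative; every arithmetic
    operation propagates both, which is the basic mechanism that
    differentiable frameworks generalize to whole physical models.
    """
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)
    __rmul__ = __mul__

def grad(f):
    """Return df/dx, computed by seeding the dual part with 1."""
    return lambda x: f(Dual(x, 1.0)).dot

# d/dx (3x^2 + 2x) = 6x + 2, so at x = 2.0 the gradient is 14.0
f = lambda x: 3 * x * x + 2 * x
g = grad(f)(2.0)  # -> 14.0
```

Real frameworks add reverse-mode differentiation, vectorization, and GPU/TPU compilation, but the derivative a sampler like NUTS consumes is the same object as `g` here, obtained without any hand-coded gradient.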