2018
DOI: 10.1093/mnras/sty3217

Deep learning of multi-element abundances from high-resolution spectroscopic data

Abstract: Deep learning with artificial neural networks is increasingly gaining attention, because of its potential for data-driven astronomy. However, this methodology usually does not provide uncertainties and does not deal with incompleteness and noise in the training data. In this work, we design a neural network for high-resolution spectroscopic analysis using APOGEE data that mimics the methodology of standard spectroscopic analyses: stellar parameters are determined using the full wavelength range, but individual…
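As a rough illustration of the architecture the abstract describes (stellar parameters inferred from the full wavelength range, individual abundances from restricted wavelength windows), here is a minimal sketch in Keras. The pixel count, window boundaries, layer sizes and element list are illustrative assumptions, not the paper's actual configuration.

```python
# Minimal sketch (assumed configuration, not the paper's actual network):
# a shared 1D-convolutional trunk sees the full spectrum and predicts the
# stellar parameters, while each element abundance is predicted from a
# masked copy of the spectrum that keeps only that element's windows.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, Model

N_PIX = 7514                      # APOGEE-like pixel count (assumption)
ELEMENT_WINDOWS = {               # hypothetical pixel windows per element
    "Fe": [(1000, 1200), (3000, 3100)],
    "Mg": [(2000, 2150)],
}

def window_mask(windows, n_pix=N_PIX):
    """Binary mask that keeps only the pixels inside the given windows."""
    mask = np.zeros(n_pix, dtype="float32")
    for lo, hi in windows:
        mask[lo:hi] = 1.0
    return mask

spectrum = layers.Input(shape=(N_PIX, 1), name="spectrum")

# Shared convolutional trunk over the full wavelength range.
x = layers.Conv1D(4, 8, activation="relu")(spectrum)
x = layers.Conv1D(16, 8, activation="relu")(x)
x = layers.MaxPooling1D(4)(x)
x = layers.Flatten()(x)
trunk = layers.Dense(196, activation="relu")(x)

# Stellar parameters (Teff, logg, [M/H]) from the full-spectrum trunk.
stellar_params = layers.Dense(3, name="stellar_params")(trunk)

# One small head per element, fed by the spectrum masked to that element's windows.
element_outputs = []
for elem, windows in ELEMENT_WINDOWS.items():
    mask = tf.constant(window_mask(windows)[None, :, None])
    masked = layers.Lambda(lambda s, m=mask: s * m, name=f"mask_{elem}")(spectrum)
    h = layers.Conv1D(8, 8, activation="relu")(masked)
    h = layers.MaxPooling1D(4)(h)
    h = layers.Flatten()(h)
    element_outputs.append(layers.Dense(1, name=f"abund_{elem}")(h))

model = Model(spectrum, [stellar_params] + element_outputs)
model.compile(optimizer="adam", loss="mse")
model.summary()
```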

Cited by 147 publications (98 citation statements)
References 41 publications (56 reference statements)
“…Artificial neural network (ANN) methods were first adopted to determine stellar atmospheric parameters by Bailer-Jones et al. (1997), and have been rejuvenated recently owing to the development of new training techniques and hardware. Inspired by the successful application of convolutional neural networks (CNNs) to APOGEE spectra (Fabbro et al. 2018; Leung & Bovy 2019), we design a specific CNN structure for transferring stellar labels from the APOGEE-Payne catalog to LAMOST-II MRS spectra.…”
Section: Methods
confidence: 99%
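The label-transfer setup this excerpt describes (train on stars in common between the two surveys, then predict labels for the rest) can be sketched as below; the file names, label set and network size are assumptions for illustration, not the cited study's actual pipeline.

```python
# Hedged sketch of cross-survey label transfer: fit a small 1D CNN on
# LAMOST-II MRS spectra of stars that also have APOGEE-Payne labels,
# then apply it to the remaining LAMOST spectra.  All file names are assumed.
import numpy as np
from tensorflow.keras import layers, Model

# Cross-matched training set (hypothetical files): normalized spectra and labels.
X_train = np.load("lamost_mrs_spectra_xmatch.npy")    # shape (n_star, n_pix)
y_train = np.load("apogee_payne_labels_xmatch.npy")   # e.g. Teff, logg, [Fe/H]

n_pix, n_label = X_train.shape[1], y_train.shape[1]

inp = layers.Input(shape=(n_pix, 1))
x = layers.Conv1D(8, 11, activation="relu")(inp)
x = layers.Conv1D(16, 11, activation="relu")(x)
x = layers.MaxPooling1D(4)(x)
x = layers.Flatten()(x)
x = layers.Dense(128, activation="relu")(x)
out = layers.Dense(n_label)(x)

model = Model(inp, out)
model.compile(optimizer="adam", loss="mse")
model.fit(X_train[..., None], y_train, validation_split=0.1,
          epochs=30, batch_size=64)

# Transfer: predict APOGEE-Payne-scale labels for LAMOST stars without APOGEE matches.
X_new = np.load("lamost_mrs_spectra_unmatched.npy")
y_pred = model.predict(X_new[..., None])
```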
“…Researchers have applied machine learning to stellar parameter estimation, for example The Cannon (Ness et al. 2015; Casey et al. 2016), The Payne, StarNet (Fabbro et al. 2018), astroNN (Leung & Bovy 2019) and GSN (Wang et al. 2019a), and most of these employ artificial neural networks to build a regression mapping. These methods depend on training and test sets, usually called reference sets, and the more completely the reference set covers the parameter space, the more information the model can obtain during training.…”
Section: Introduction
confidence: 99%
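Since the quoted point is that a reference set should cover the parameter space as completely as possible, a simple coverage check like the one below can flag stars whose labels would require extrapolation; the label array layout is an assumption for illustration.

```python
# Hedged sketch: flag targets whose labels fall outside the range spanned by
# the reference (training) set, i.e. predictions that would be extrapolations.
import numpy as np

def extrapolation_mask(ref_labels, target_labels, margin=0.0):
    """True for any target whose labels lie outside the reference-set range.

    ref_labels, target_labels: arrays of shape (n_star, n_label),
    e.g. columns Teff, logg, [Fe/H] (assumed layout).
    """
    lo = ref_labels.min(axis=0) - margin
    hi = ref_labels.max(axis=0) + margin
    outside = (target_labels < lo) | (target_labels > hi)
    return outside.any(axis=1)

# Example with synthetic numbers: the second target lies outside the Teff range.
ref = np.array([[4800, 2.5, -0.3], [5200, 3.0, 0.1], [4500, 1.8, -0.8]])
targets = np.array([[5000, 2.7, 0.0], [7000, 4.2, 0.3]])
print(extrapolation_mask(ref, targets))   # [False  True]
```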
“…Stars observed by APOGEE-2 are assigned chemical abundances by the APOGEE Stellar Parameter and Chemical Abundances Pipeline (ASPCAP; García Pérez et al. 2016). However, in this work we make use of abundances estimated from the spectra by the astroNN deep learning package (Leung & Bovy 2019b, https://github.com/henrysky/astroNN), which was trained on the results of ASPCAP but is significantly faster and obtains higher-precision abundances than ASPCAP even when the signal-to-noise ratio of a spectrum is below APOGEE's target of 100.…”
Section: Observations
confidence: 99%
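Training a network on ASPCAP output means coping with stars for which some abundances are missing or flagged, which the abstract also raises as a shortcoming of standard deep learning. One plausible way to handle this is a loss that simply ignores missing labels, sketched below under the assumption that missing entries are stored as NaN; this is an illustration, not the package's actual implementation.

```python
# Hedged sketch of a masked mean-squared-error loss: label entries stored as
# NaN (assumed convention for "missing") contribute nothing to the gradient.
import tensorflow as tf

def masked_mse(y_true, y_pred):
    """MSE over the labels that are actually present (non-NaN) in y_true."""
    present = tf.math.logical_not(tf.math.is_nan(y_true))
    # Replace NaNs so the subtraction itself never produces NaN gradients.
    y_true_safe = tf.where(present, y_true, tf.zeros_like(y_true))
    sq_err = tf.square(y_true_safe - y_pred)
    sq_err = tf.where(present, sq_err, tf.zeros_like(sq_err))
    n_present = tf.reduce_sum(tf.cast(present, sq_err.dtype))
    return tf.reduce_sum(sq_err) / tf.maximum(n_present, 1.0)

# Usage (with any Keras regression model producing one output per label):
# model.compile(optimizer="adam", loss=masked_mse)
```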
“…On-sky positions (RA, Dec) and line-of-sight velocities are taken directly from APOGEE, and proper motions are taken from Gaia DR2. We make use of stellar distances calculated by Leung & Bovy (2019a), who use the astroNN deep learning package to estimate distances from both APOGEE spectra and 2MASS photometry (Skrutskie et al. 2006), training on Gaia parallax data.…”
Section: Observations
confidence: 99%
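The quoted distance determination rests on the standard relation between absolute magnitude, apparent magnitude and parallax: a network trained to predict an absolute magnitude can be checked against Gaia by converting its prediction to a parallax. The small sketch below shows that bookkeeping; the magnitudes are made-up numbers for illustration and extinction is ignored.

```python
# Hedged sketch: convert a predicted absolute magnitude M (e.g. in 2MASS Ks)
# and an observed apparent magnitude m into a distance and a predicted
# parallax that can be compared with Gaia.
import numpy as np

def distance_pc(abs_mag, app_mag):
    """Distance in parsec from the distance modulus m - M = 5 log10(d) - 5."""
    return 10.0 ** ((app_mag - abs_mag + 5.0) / 5.0)

def predicted_parallax_mas(abs_mag, app_mag):
    """Parallax in milliarcsec implied by the predicted absolute magnitude."""
    return 1000.0 / distance_pc(abs_mag, app_mag)

# Illustrative numbers only: a red-clump-like star with M_Ks ~ -1.6, m_Ks = 10.4.
abs_mag, app_mag = -1.6, 10.4
print(distance_pc(abs_mag, app_mag))             # ~2512 pc
print(predicted_parallax_mas(abs_mag, app_mag))  # ~0.40 mas
```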
“…In such a situation, deep learning is gaining popularity even in the field of astronomy (e.g. Hezaveh et al. 2017; George & Huerta 2018; Schaefer et al. 2018; Leung & Bovy 2019). Like other machine learning techniques, it trains a computer to make intelligent judgements and to recognize patterns embedded in the data.…”
Section: Introduction
confidence: 99%