2014
DOI: 10.1016/j.newast.2013.09.007

Stellar spectral subclasses classification based on Isomap and SVM

Cited by 40 publications (17 citation statements)
References 9 publications

“…It has been extensively used in data science and made a break into astronomy as a classification algorithm (Matijevič et al 2017; Lochner et al 2016; Valentini et al 2017; Traven et al 2017), along with other manifold learning algorithms (e.g. Vanderplas & Connolly 2009; Daniel et al 2011; Bu et al 2014). We extend its use as a pure manifold learning algorithm to find structure in a 13-dimensional C-space.…”
Section: t-distributed Stochastic Neighbour Embedding (t-SNE)
Citation type: mentioning
confidence: 99%
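
The excerpt above uses t-SNE as a pure manifold learning tool on a 13-dimensional abundance space. As a rough illustration only (not the cited authors' code), a minimal sketch assuming scikit-learn's TSNE and random placeholder data could look like this:

```python
# Minimal sketch (assumptions: scikit-learn's TSNE; random placeholder data
# standing in for a real 13-dimensional abundance space).
import numpy as np
from sklearn.manifold import TSNE

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 13))   # 500 points in a 13-dimensional space

# Embed into 2-D purely to look for structure, not to classify.
embedding = TSNE(n_components=2, perplexity=30.0, init="pca",
                 random_state=0).fit_transform(X)
print(embedding.shape)           # (500, 2)
```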
“…Thus, in contrast to MDS, Isomap can also capture non-linear manifold structures. In astronomy, it was recently applied to spectroscopic classification by Bu et al (2014).…”
Section: Isomap
Citation type: mentioning
confidence: 99%
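
To make the contrast in that excerpt concrete, the sketch below (an illustration only, assuming scikit-learn and a synthetic S-curve data set rather than any of the cited spectra; scikit-learn's metric MDS stands in for classical MDS) embeds the same non-linear manifold with both methods:

```python
# Minimal sketch (assumptions: scikit-learn; synthetic S-curve data).
# MDS embeds points so as to preserve straight-line (Euclidean) distances,
# while Isomap approximates geodesic distances along the manifold through a
# k-nearest-neighbour graph before embedding.
from sklearn.datasets import make_s_curve
from sklearn.manifold import MDS, Isomap

X, _ = make_s_curve(n_samples=1000, random_state=0)

X_mds = MDS(n_components=2, random_state=0).fit_transform(X)
X_iso = Isomap(n_neighbors=10, n_components=2).fit_transform(X)
print(X_mds.shape, X_iso.shape)  # (1000, 2) (1000, 2)
```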
“…The resulting pairwise geodesic distance matrix is then transformed using classical multidimensional scaling [48]. Isomap was used, for instance, for classification of stellar spectral subclasses in SDSS data [5] and for discovering white dwarf + main sequence binaries in the same survey [53]. In both cases the Support Vector Machine method was employed as the classification engine, and the superiority of this solution over the one using PCA was demonstrated once more.…”
Section: Techniques of Feature Extraction
Citation type: mentioning
confidence: 99%
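
The Isomap-plus-SVM pipeline described in this excerpt can be sketched as follows. This is a hedged illustration, not the cited implementations: it assumes scikit-learn, uses random placeholder "spectra" and subclass labels, and the neighbour count, embedding dimension, and SVM parameters are hypothetical choices rather than the settings of Bu et al.

```python
# Minimal sketch (assumptions: scikit-learn; random placeholder data; the
# Isomap/SVM parameters below are illustrative, not those of the cited works).
import numpy as np
from sklearn.manifold import Isomap
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(600, 200))   # placeholder flux vectors (600 spectra, 200 pixels)
y = rng.integers(0, 4, size=600)  # placeholder spectral subclass labels

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

model = make_pipeline(
    StandardScaler(),
    Isomap(n_neighbors=15, n_components=10),   # geodesic distances + classical MDS
    SVC(kernel="rbf", C=10.0, gamma="scale"),  # classification engine
)
model.fit(X_train, y_train)
print("test accuracy:", model.score(X_test, y_test))
```

Swapping the Isomap step for a PCA reduction with the same number of components gives the baseline that the excerpt reports as inferior.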