2017 IEEE International Conference on Bioinformatics and Biomedicine (BIBM)
DOI: 10.1109/bibm.2017.8217720
Visualization of disease relationships by multiple maps t-SNE regularization based on Nesterov accelerated gradient

Cited by 11 publications (6 citation statements). References 6 publications.
“…Neighborhood Preservation Ratio (NPR) has been used as a measure of correctness in embeddings and to quantitatively assess the performance of different dimensionality reduction algorithms [35][36][37]. There are different ways to calculate NPR; for instance, Shen et al [38] evaluated an NPR score that gives a general quality score for the whole embedding. In contrast, Maaten et al [35] calculated an NPR score per sample in the embedding, which uses the ratio of preserved neighbors of each sample in both high- and low-dimensional spaces.…”
Section: Neighborhood Preservation Ratio
Confidence: 99%
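The per-sample NPR described in the excerpt above can be sketched as follows. This is a minimal illustration, assuming scikit-learn for the nearest-neighbor queries; the function name `npr_per_sample` and the choice of `k` are hypothetical, not taken from the cited papers.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

def npr_per_sample(X_high, X_low, k=10):
    """Per-sample Neighborhood Preservation Ratio: the fraction of each
    point's k nearest neighbors in the high-dimensional space that remain
    among its k nearest neighbors in the low-dimensional embedding."""
    # Query k+1 neighbors because each point is returned as its own
    # nearest neighbor, then drop that self-match with [:, 1:].
    idx_high = (NearestNeighbors(n_neighbors=k + 1).fit(X_high)
                .kneighbors(X_high, return_distance=False)[:, 1:])
    idx_low = (NearestNeighbors(n_neighbors=k + 1).fit(X_low)
               .kneighbors(X_low, return_distance=False)[:, 1:])
    # Ratio of shared neighbors per sample, in [0, 1].
    return np.array([len(set(h) & set(l)) / k
                     for h, l in zip(idx_high, idx_low)])
```

Averaging the returned vector gives a single whole-embedding quality score, which corresponds to the global variant of NPR mentioned in the same excerpt.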
“…They included t-distributed stochastic neighbor embedding (t-SNE), locally linear embedding (LLE), isometric mapping (ISOMAP), kernel principal component analysis (KPCA), and multidimensional scaling (MDS) [19]-[28]. At the same time, we included four other peer methods in the experiments for comparison: principal component analysis (PCA), sparse PCA, nonnegative matrix factorization (NMF), and random projection [21], [25], [26].…”
Section: Introduction
Confidence: 99%
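A comparison across the peer methods listed in the excerpt above can be set up as a single loop over estimators. This is a sketch under the assumption that scikit-learn implementations of these methods are acceptable stand-ins; the function name `embed_all` and all hyperparameters shown are illustrative, not from the cited work.

```python
import numpy as np
from sklearn.decomposition import NMF, PCA, KernelPCA, SparsePCA
from sklearn.manifold import MDS, TSNE, Isomap, LocallyLinearEmbedding
from sklearn.random_projection import GaussianRandomProjection

def embed_all(X, n_components=2, seed=0):
    """Fit each peer dimensionality-reduction method on the same data
    and return a dict mapping method name -> low-dimensional embedding.
    Note: NMF requires nonnegative input."""
    methods = {
        "PCA": PCA(n_components=n_components),
        "SparsePCA": SparsePCA(n_components=n_components, random_state=seed),
        "NMF": NMF(n_components=n_components, init="nndsvda", max_iter=500),
        "RandomProjection": GaussianRandomProjection(
            n_components=n_components, random_state=seed),
        "KPCA": KernelPCA(n_components=n_components, kernel="rbf"),
        "MDS": MDS(n_components=n_components, random_state=seed),
        "ISOMAP": Isomap(n_components=n_components),
        "LLE": LocallyLinearEmbedding(
            n_components=n_components, random_state=seed),
        "t-SNE": TSNE(n_components=n_components, random_state=seed,
                      perplexity=10),
    }
    return {name: est.fit_transform(X) for name, est in methods.items()}
```

The resulting embeddings can then be scored with a common quality measure (such as the per-sample NPR quoted earlier in this report) to compare the methods on equal footing.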
“…For dimensionality reduction purposes, the machine learning algorithm t-SNE 38 transformed our 512-dimensional feature vectors into 2-dimensional data points. The images were subsequently embedded at their determined location in the GM, visually describing the similarity relationship between the images.…”
Section: Methods
Confidence: 99%
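The projection step in the excerpt above, from high-dimensional image features to 2-D placement coordinates, can be sketched as follows. This is a minimal illustration assuming scikit-learn's t-SNE; the function name `embed_features` and the perplexity heuristic are assumptions, not details from the cited paper.

```python
import numpy as np
from sklearn.manifold import TSNE

def embed_features(features, seed=0):
    """Project high-dimensional image feature vectors (e.g. 512-d) to
    2-D coordinates with t-SNE. Each image can then be drawn at its
    returned (x, y) location to visualize similarity between images."""
    features = np.asarray(features)
    # Perplexity must be smaller than the number of samples.
    tsne = TSNE(n_components=2, random_state=seed,
                perplexity=min(30, len(features) - 1))
    return tsne.fit_transform(features)
```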