Latent Space Geometric Statistics (2021)
DOI: 10.1007/978-3-030-68780-9_16

Cited by 4 publications (4 citation statements)
References 17 publications
“…For prediction training with the geodesic loss function, we set the number of interpolated points to be four. In addition, to speed up training, we ran the loop in Algorithm 1 for a fixed number of iterations (15-20) instead of until convergence.…”
Section: Prediction Results (mentioning; confidence: 99%)
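The fixed-iteration shortcut described in this excerpt is a common way to approximate latent-space geodesics. The sketch below minimizes the discrete curve energy of a decoded latent path for a fixed iteration budget rather than to convergence; the `decode` function, optimizer, and all hyperparameters are illustrative assumptions, not the cited paper's Algorithm 1.

```python
# Minimal sketch of a geodesic-style loss with a fixed iteration budget.
# `decode` is a placeholder batched decoder mapping latent points to data space.
import torch

def geodesic_loss(decode, z0, z1, n_interp=4, n_iters=20, lr=1e-2):
    """Approximate the geodesic distance between z0 and z1 by optimizing
    n_interp interior points of a discretized latent curve for a fixed
    number of iterations (e.g. 15-20) instead of until convergence."""
    # Initialize interior points on the straight line between the endpoints.
    ts = torch.linspace(0, 1, n_interp + 2)[1:-1].unsqueeze(1)
    zs = ((1 - ts) * z0 + ts * z1).detach().requires_grad_(True)
    opt = torch.optim.Adam([zs], lr=lr)
    for _ in range(n_iters):  # fixed iteration budget
        curve = torch.cat([z0.unsqueeze(0), zs, z1.unsqueeze(0)], dim=0)
        xs = decode(curve)                         # map the curve into data space
        segs = xs[1:] - xs[:-1]
        energy = segs.pow(2).sum()                 # discrete curve energy
        opt.zero_grad()
        energy.backward()
        opt.step()
    # The curve length (sum of segment norms) approximates geodesic distance.
    with torch.no_grad():
        curve = torch.cat([z0.unsqueeze(0), zs, z1.unsqueeze(0)], dim=0)
        xs = decode(curve)
        return (xs[1:] - xs[:-1]).norm(dim=-1).sum()
```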
“…Recent work has shown that distances in latent space are not representative of the true distance between datapoints [1,19,25]. Rather, deep generative models learn a mapping from the latent space to the data manifold, a smoothly varying lower-dimensional subset of the original data space.…”
Section: Data Manifold Learning (mentioning; confidence: 99%)
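To make the quoted claim concrete: under the pullback metric induced by a decoder g, the length of a latent curve c(t) is L(c) = ∫ ‖J_g(c(t)) ċ(t)‖ dt, which generally differs from the Euclidean latent distance ‖z1 − z0‖. A minimal sketch, assuming a placeholder `decode` function:

```python
# Length of a straight latent segment measured in data space via the
# decoder Jacobian (pullback metric). The discretization is illustrative.
import torch
from torch.autograd.functional import jvp

def pullback_length(decode, z0, z1, n_steps=20):
    """Riemann-sum approximation of the pullback length of the straight
    latent segment z0 -> z1: sum over steps of ||J_g(z) (z1-z0)/n||."""
    v = (z1 - z0) / n_steps              # constant latent velocity per step
    length = 0.0
    for t in torch.linspace(0, 1, n_steps + 1)[:-1]:
        z = (1 - t) * z0 + t * z1
        # Jacobian-vector product J_g(z) @ v without forming the full Jacobian.
        _, Jv = jvp(decode, (z,), (v,))
        length += Jv.norm().item()
    return length

# Euclidean latent distance for comparison: (z1 - z0).norm().
# The two agree only if the decoder is an isometry, which it rarely is.
```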
“…Given the high dimensionality of the admission-level embeddings, we explore employing the dimensionality reduction algorithm UMAP to generate human-readable 2-dimensional plots of corpus manifolds for MIMIC-IV. The UMAP algorithm is known for preserving local structure between points while also maintaining more global structure than other dimensionality reduction methods such as PCA [45] and t-SNE [46]. This property makes UMAP a valuable technique for interpreting encodings or embeddings of high-dimensional data, and it has been successfully applied in previous work to visualize biological datasets such as transcriptomes of single-cell data [47].…”
Section: Methods (mentioning; confidence: 99%)
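A minimal sketch of this visualization step using the umap-learn package; the `embeddings` array and all parameter values below are placeholders, not the cited paper's settings.

```python
# Project high-dimensional embeddings to 2-D with UMAP for visualization.
import numpy as np
import umap
import matplotlib.pyplot as plt

embeddings = np.random.rand(1000, 256)       # placeholder high-dim embeddings

reducer = umap.UMAP(n_components=2,          # 2-D for human-readable plots
                    n_neighbors=15,          # trades off local vs. global structure
                    min_dist=0.1,
                    random_state=42)
coords = reducer.fit_transform(embeddings)   # shape (1000, 2)

plt.scatter(coords[:, 0], coords[:, 1], s=3)
plt.title("UMAP projection of admission-level embeddings")
plt.show()
```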
“…The generator's core task is two-fold: first, it applies a feature extraction module to map input samples into latent space, encoding the essential feature information into latent vectors [42-44]. Subsequently, the generator employs an image reconstruction module to decode the feature information from these latent vectors and reconstruct the dehazed image.…”
Section: Methods (mentioning; confidence: 99%)
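A schematic of such a two-part generator in PyTorch; the layer shapes and names are illustrative assumptions, not the cited paper's network.

```python
# Encoder-decoder generator: a feature-extraction module encodes the hazy
# input into latent features, and a reconstruction module decodes them
# back into a dehazed image.
import torch
import torch.nn as nn

class Generator(nn.Module):
    def __init__(self, latent_channels=64):
        super().__init__()
        # Feature extraction: map input samples into latent space.
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, latent_channels, 3, stride=2, padding=1), nn.ReLU(),
        )
        # Image reconstruction: decode latent features into the dehazed image.
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(latent_channels, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),
        )

    def forward(self, hazy):
        latent = self.encoder(hazy)   # feature information -> latent vectors
        return self.decoder(latent)   # reconstruct the dehazed image

# Usage: dehazed = Generator()(torch.rand(1, 3, 256, 256))
```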