2016
DOI: 10.18178/ijmlc.2016.6.4.601
Random Projections for Non-linear Dimensionality Reduction

Abstract: The need to analyze high-dimensional data in various areas, such as image processing, human gene regulation, and smart grids, raises the importance of dimensionality reduction. While classical linear dimensionality reduction methods are easy to implement and efficient to compute, they fail to discover the true structure of high-dimensional data lying on a non-linear subspace. To overcome this issue, many non-linear dimensionality reduction approaches, such as Locally Linear Embedding, Isometric Em…
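The classical linear random projection the abstract contrasts against can be sketched in a few lines: multiply the data by a Gaussian matrix whose entries have variance 1/k, so squared norms and distances are preserved in expectation. This is a generic illustration of the technique, not the specific method of the paper.

```python
import numpy as np

def random_projection(X, k, seed=0):
    """Project rows of X from d dimensions down to k using a
    Gaussian random matrix (a classical linear random projection)."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Entries ~ N(0, 1/k) so squared distances are preserved in expectation.
    R = rng.normal(0.0, 1.0 / np.sqrt(k), size=(d, k))
    return X @ R

# Example: 100 points in 1000 dimensions -> 50 dimensions.
X = np.random.default_rng(1).normal(size=(100, 1000))
Y = random_projection(X, 50)
```

Because the projection is linear, it is cheap and data-independent, but it cannot unfold data lying on a curved manifold; that is the gap the non-linear methods named in the abstract address.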

Cited by 16 publications (2 citation statements)
References 23 publications
“…In other words, we can distill the knowledge from one model (the massive, or teacher, model) to another (the small, or student, model). Previous work has shown that KD can significantly boost prediction accuracy in natural language processing and speech processing (Kim and Rush, 2016; Hu et al., 2018; Huang et al., 2018b; Hahn and Choi, 2019; Liu et al., 2021a,b; Cheng et al., 2016a,b; Cheng and You, 2016; You et al., 2019a, 2020b, 2021e, 2022b; Lyu et al., 2018, 2019; Guha et al., 2020; Yang et al., 2020; Ma et al., 2021a,b), while adopting KD-based methods for SQA tasks has been less explored. In this work, our goal is to handle the SCQA tasks.…”
Section: Spoken Question Answering
Confidence: 99%
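The teacher-to-student distillation described in this statement is commonly implemented as a KL divergence between temperature-softened output distributions. The sketch below shows that widely used soft-label formulation; the cited works may use different variants, so treat this as an illustrative assumption rather than their exact objective.

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """Mean KL divergence KL(teacher || student) between
    temperature-softened distributions (soft-label KD)."""
    p = softmax(teacher_logits, T)  # teacher "soft labels"
    q = softmax(student_logits, T)
    kl = np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12)), axis=-1)
    return float(kl.mean())

# Hypothetical logits for a 3-class problem.
teacher_logits = np.array([[2.0, 0.5, -1.0]])
student_logits = np.array([[1.5, 0.2, -0.5]])
loss = distillation_loss(student_logits, teacher_logits, T=2.0)
```

A higher temperature T flattens the teacher distribution, exposing the relative similarities between classes that the student is trained to mimic.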
“…- Non-linear Random Projections: RP-based techniques have been used to capture non-linear features in a compact representation. Approaches range from RP-based preprocessing for existing non-linear dimensionality reduction methods [22] to ad hoc variants for non-linear kernel functions [5,23].
- Structured Johnson-Lindenstrauss: Following the work of [24], structured JL methods try to approximate the result of a traditional RP by decomposing the projection matrix into a set of low-memory matrices [25,26].…”
Section: Random Projection Variants
Confidence: 99%
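The low-memory idea behind structured JL methods can be illustrated with a simpler relative: Achlioptas-style sparse random projections, where most matrix entries are exactly zero, cutting storage and multiplication cost while keeping the JL distance-preservation guarantee. This is not the specific construction of the works cited above, just a minimal sketch of the same memory-saving principle.

```python
import numpy as np

def sparse_random_projection(X, k, seed=0):
    """Achlioptas-style sparse random projection: entries are
    +sqrt(3/k), 0, -sqrt(3/k) with probabilities 1/6, 2/3, 1/6.
    Each entry has variance 1/k, so distances are preserved in
    expectation, yet two thirds of the matrix is zero."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    vals = np.array([np.sqrt(3.0 / k), 0.0, -np.sqrt(3.0 / k)])
    R = rng.choice(vals, size=(d, k), p=[1 / 6, 2 / 3, 1 / 6])
    return X @ R

# Example: 50 points in 500 dimensions -> 100 dimensions.
X = np.random.default_rng(2).normal(size=(50, 500))
Y = sparse_random_projection(X, 100)
```

Structured JL methods push this further by replacing the dense matrix with products of fast, implicitly stored transforms (e.g. diagonal and Hadamard-like factors), but the sparsity trick above already shows why the projection matrix need not be stored densely.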