The 27th Chinese Control and Decision Conference (2015 CCDC)
DOI: 10.1109/ccdc.2015.7162428

An optimized dimensionality reduction model for high-dimensional data based on Restricted Boltzmann Machines

Abstract: Dimensionality reduction is a common optimization step in high-dimensional data analysis. A number of approaches based on traditional multivariate statistics have been proposed and applied in recent years, but they do not solve the dimensionality reduction problem well. The difficulty arises because high-dimensional data generally have no specific distribution and little prior information. To address this problem, an optimized dimensionality reduction model based on Restricted Boltzmann Machines (RBM) is presented…
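The abstract describes mapping high-dimensional inputs to a lower-dimensional representation with an RBM. As a minimal sketch of that general technique (not the paper's specific optimized model), the example below trains scikit-learn's BernoulliRBM and uses its hidden-unit activations as reduced features; the dataset, unit counts, and hyperparameters are illustrative assumptions.

```python
# Minimal sketch: RBM-based dimensionality reduction with scikit-learn.
# Illustrates the general technique only; the paper's optimized model,
# hyperparameters, and data are not reproduced here.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.neural_network import BernoulliRBM
from sklearn.preprocessing import minmax_scale

# Example data: 64-dimensional digit images scaled to [0, 1],
# since BernoulliRBM expects inputs in that range.
X = minmax_scale(load_digits().data)

# Map 64 visible units down to 16 hidden units (illustrative sizes).
rbm = BernoulliRBM(n_components=16, learning_rate=0.05,
                   n_iter=20, random_state=0)
rbm.fit(X)

# transform() returns P(h = 1 | v); these hidden activations serve as
# the reduced-dimensional representation of each sample.
X_reduced = rbm.transform(X)
print(X.shape, "->", X_reduced.shape)  # (1797, 64) -> (1797, 16)
```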

Cited by 8 publications (3 citation statements) · References 22 publications

Citation statements (ordered by relevance):
“…The RBM was invented by Hinton in 2007 for learning a probability distribution over its set of inputs [95]. It is a generative stochastic artificial neural network that has wide applications in different areas such as dimensionality reduction [96], classification [97], regression [98], collaborative filtering [99], feature learning [100], and topic modeling [101].…”
Section: Deep Learning Models and Methods (mentioning)
confidence: 99%
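The quoted statement notes that an RBM learns a probability distribution over its inputs. As a hedged sketch of how that training typically proceeds, the NumPy fragment below implements one step of contrastive divergence (CD-1), the standard approximate learning rule; the layer sizes, batch, and learning rate are illustrative assumptions, not values from the cited works.

```python
# Sketch of one CD-1 update for a binary RBM (illustrative only).
import numpy as np

rng = np.random.default_rng(0)
n_visible, n_hidden, lr = 64, 16, 0.05      # assumed sizes / learning rate
W = rng.normal(0, 0.01, (n_visible, n_hidden))
b_v = np.zeros(n_visible)                   # visible biases
b_h = np.zeros(n_hidden)                    # hidden biases

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_step(v0):
    """One contrastive-divergence step on a batch of binary vectors v0."""
    # Positive phase: hidden probabilities given the data.
    ph0 = sigmoid(v0 @ W + b_h)
    h0 = (rng.random(ph0.shape) < ph0).astype(float)
    # Negative phase: one Gibbs step back to a reconstruction.
    pv1 = sigmoid(h0 @ W.T + b_v)
    ph1 = sigmoid(pv1 @ W + b_h)
    # Gradient estimate: data correlations minus model correlations.
    return (v0.T @ ph0 - pv1.T @ ph1) / len(v0), (v0 - pv1), (ph0 - ph1)

v0 = (rng.random((32, n_visible)) < 0.5).astype(float)  # toy batch
dW, dv, dh = cd1_step(v0)
W += lr * dW
b_v += lr * dv.mean(axis=0)
b_h += lr * dh.mean(axis=0)
```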
“…Different methods for feature selection have been mentioned in the published literature, such as multidimensional scaling [76], independent component analysis [77], latent semantic indexing [78], partial least squares [79], and PCA [80]. The related literature review showed that PCA is one of the best statistical techniques for feature reduction [81][82][83][84][85]. In this study, PCA was used to extract the principal components (PCs) to be used as predictors, in order to make an ANN model more effective in predicting the energy output of Iranian tea production.…”
Section: Principal Component Analysis (mentioning)
confidence: 99%
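To make the PCA-as-preprocessor pipeline described above concrete (principal components fed to a downstream ANN predictor), here is a hedged sketch in scikit-learn; the synthetic data, variance threshold, and regressor settings are assumptions for illustration, not details from the cited study.

```python
# Sketch: PCA feature extraction feeding a neural-network regressor.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 30))   # toy inputs (stand-in for energy-input features)
y = X[:, :3].sum(axis=1) + rng.normal(scale=0.1, size=500)  # toy target

# Standardize, keep enough PCs to explain 95% of variance, then fit an ANN.
model = make_pipeline(StandardScaler(),
                      PCA(n_components=0.95),
                      MLPRegressor(hidden_layer_sizes=(16,),
                                   max_iter=2000, random_state=0))
model.fit(X, y)
print("PCs kept:", model.named_steps["pca"].n_components_)
print("R^2:", model.score(X, y))
```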
“…The typical Euclidean distance used for similarity measurement is inefficient when the number of variables is large and the number of samples is relatively small [3]; 2. The computational complexity of the algorithm increases with the number of dimensions [4]; and 3. It is difficult to determine the cluster centroids if the data values are sparse, i.e., only a small number of data entries have a non-null value.…”
Section: Introduction (mentioning)
confidence: 99%
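The first point, that Euclidean distance loses discriminating power as dimensionality grows, can be checked empirically. The toy demo below (an assumption of this rewrite, not code from the cited paper) measures how the relative gap between the nearest and farthest pairwise distances shrinks as the dimension increases.

```python
# Toy demo of distance concentration in high dimensions.
import numpy as np
from scipy.spatial.distance import pdist

rng = np.random.default_rng(0)
for d in (2, 10, 100, 1000):
    X = rng.random((200, d))    # 200 points uniform in the d-dimensional cube
    dist = pdist(X)             # all pairwise Euclidean distances
    # Relative contrast (max - min) / min shrinks as d grows, so the
    # "nearest" and "farthest" neighbours become nearly indistinguishable.
    contrast = (dist.max() - dist.min()) / dist.min()
    print(f"d={d:5d}  relative contrast={contrast:.2f}")
```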