2015 International Joint Conference on Neural Networks (IJCNN)
DOI: 10.1109/ijcnn.2015.7280688
Normal sparse Deep Belief Network

Abstract: Deep architectures are now very popular in machine learning. Deep Belief Networks (DBNs) use a deep architecture to build a powerful generative model from training data, and can be used for classification and feature learning. A DBN can be trained unsupervised, and the learned features are then suitable for a simple classifier (such as a linear classifier) given only a few labeled examples. According to prior research, DBN training can be improved to produce more interpretable features…
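The pipeline the abstract describes — unsupervised feature learning followed by a simple linear classifier on a few labels — can be sketched with scikit-learn's `BernoulliRBM` standing in for a single DBN layer. This is a minimal illustrative sketch on toy data, not the paper's implementation:

```python
import numpy as np
from sklearn.neural_network import BernoulliRBM
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline

# Toy binary data: 200 samples, 64 features (e.g. 8x8 binarized images).
rng = np.random.default_rng(0)
X = (rng.random((200, 64)) > 0.5).astype(float)
y = rng.integers(0, 2, size=200)

# Unsupervised RBM feature learning, then a linear classifier on the
# learned features -- the few-labels regime the abstract describes.
model = Pipeline([
    ("rbm", BernoulliRBM(n_components=32, learning_rate=0.05,
                         n_iter=10, random_state=0)),
    ("clf", LogisticRegression(max_iter=1000)),
])
model.fit(X, y)
print(model.predict(X[:5]).shape)  # (5,)
```

A full DBN would stack several such RBM layers greedily before attaching the classifier; the single-layer pipeline above only shows the division of labor between unsupervised features and a linear readout.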

Cited by 41 publications
(6 citation statements)
References 11 publications
“…We tested our proposed method on Matlab R2016b and used the DeeBNET Toolbox [18]. All experiments were run on a 64-bit Windows 10 PC with an Intel Core i7-6500 CPU, 8 GB of RAM and an NVIDIA GeForce 920MX (2GB) graphics card.…”
Section: Tests and Results
See 1 more Smart Citation
“…We tested our proposed method on Matlab R2016b and used the DeeBNET Toolbox [18]. All experiments were run on a 64-bit Windows 10 PC with an Intel Core i7-6500 CPU, 8 GB of RAM and an NVIDIA GeForce 920MX (2GB) graphics card.…”
Section: Tests and Resultsmentioning
confidence: 99%
“…finding the most efficient compact representation for the input data). A DBN auto-encoder [18] is a model comprising auto-encoder RBMs that permit creation of a generative model for extracting features from the encrypted data. Usually, the data vector is saved in the last hidden layer.…”
Section: Figure 2, Schematic Representation of a DBN Model
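The DBN auto-encoder idea quoted above — greedy layer-wise RBM training, with the deepest hidden layer kept as the compact representation — can be illustrated with two stacked `BernoulliRBM`s. The layer sizes and data here are illustrative assumptions, not values from the cited work:

```python
import numpy as np
from sklearn.neural_network import BernoulliRBM

# Toy data: 100 binary vectors of length 64.
rng = np.random.default_rng(0)
X = (rng.random((100, 64)) > 0.5).astype(float)

# Greedy layer-wise stack of two RBMs; the deepest hidden layer
# serves as the compact code, as the citing text describes.
rbm1 = BernoulliRBM(n_components=32, n_iter=5, random_state=0).fit(X)
h1 = rbm1.transform(X)
rbm2 = BernoulliRBM(n_components=8, n_iter=5, random_state=0).fit(h1)
code = rbm2.transform(h1)  # the saved representation, per the quote
print(code.shape)  # (100, 8)
```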
“…Adachi et al [22] used samples from a D-Wave quantum annealing machine to estimate model expectations of Restricted Boltzmann Machines. Keyvanrad et al [23] developed a new model named nsDBN that has different behaviors according to deviation of the activation of the hidden units from a fixed value. Meanwhile, the model has a variance parameter that can control the force degree of sparseness.…”
Section: Hybrid Model
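The nsDBN behavior described in the quote — a sparsity term driven by the deviation of hidden-unit activations from a fixed value, with a variance parameter controlling how strongly sparseness is enforced — can be sketched as a normal-shaped penalty. The function name, functional form, and default values below are illustrative assumptions, not the published nsDBN formula:

```python
import numpy as np

def normal_sparsity_penalty(q, p=0.1, sigma=0.2, lam=1.0):
    """Illustrative penalty: grows as mean hidden activations q
    deviate from the fixed target p; sigma controls how sharply
    (the 'force degree' of sparseness) and lam scales the term.
    Assumed form for illustration, not the paper's exact formula."""
    return lam * np.sum(1.0 - np.exp(-((q - p) ** 2) / (2 * sigma ** 2)))

# Activations at the target incur no penalty...
low = normal_sparsity_penalty(np.full(16, 0.1))
# ...while activations far from it are penalized strongly.
high = normal_sparsity_penalty(np.full(16, 0.9))
print(low < high)  # True
```

Shrinking `sigma` makes the penalty basin around `p` narrower, which matches the quote's point that the variance parameter tunes how forcefully sparseness is imposed.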
“…All experiments were performed using MATLAB/Octave toolboxes: MIR Toolbox [31] for spectrogram extraction from music files, Sparse Representation Toolbox [32] for the L1-regularized least squares implementation, and DeeBNet [33] for the implementation of the autoencoder and sparse RBM.…”
Section: Common Conditions of the Experiments