2019 18th IEEE International Conference on Machine Learning and Applications (ICMLA)
DOI: 10.1109/icmla.2019.00265
CSNNs: Unsupervised, Backpropagation-Free Convolutional Neural Networks for Representation Learning

Abstract: This work combines Convolutional Neural Networks (CNNs), clustering via Self-Organizing Maps (SOMs), and Hebbian Learning to propose the building blocks of Convolutional Self-Organizing Neural Networks (CSNNs), which learn representations in an unsupervised and Backpropagation-free manner. Our approach replaces the learning of traditional convolutional layers from CNNs with the competitive learning procedure of SOMs and simultaneously learns local masks between those layers with separate Hebbian-like learning r…
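The abstract names two mechanisms: convolutional filters trained by SOM competitive learning instead of backpropagation, and Hebbian-like local masks learned between layers. The sketch below illustrates these two ideas in NumPy. It is not the authors' implementation; the class name, the hebbian_mask_update helper, and all hyperparameters (grid size, learning rate, neighborhood width) are assumptions for illustration only.

# Minimal illustrative sketch (assumed names/values, not the authors' code):
# convolutional filters arranged on a SOM grid compete for each input patch,
# and the winner plus its grid neighbors are pulled toward the patch --
# a competitive, backpropagation-free update. A Hebbian-like rule sketches
# the local masks learned between layers.
import numpy as np

def extract_patches(image, k):
    # Slide a k x k window over an (H, W, C) image -> (N, k*k*C) patch matrix.
    H, W, C = image.shape
    return np.stack([image[i:i + k, j:j + k, :].ravel()
                     for i in range(H - k + 1)
                     for j in range(W - k + 1)])

class SOMConvLayer:
    def __init__(self, grid=(8, 8), k=3, channels=3, lr=0.1, sigma=1.5, seed=0):
        rng = np.random.default_rng(seed)
        self.k = k
        self.lr = lr          # assumed learning rate
        self.sigma = sigma    # assumed neighborhood width on the SOM grid
        self.coords = np.array([(r, c) for r in range(grid[0])
                                for c in range(grid[1])], dtype=float)
        self.filters = rng.standard_normal((grid[0] * grid[1], k * k * channels))

    def update(self, patches):
        # One SOM competitive-learning pass: no gradients, no labels.
        for x in patches:
            d = np.linalg.norm(self.filters - x, axis=1)
            bmu = np.argmin(d)                          # best-matching unit
            g = np.exp(-np.sum((self.coords - self.coords[bmu]) ** 2, axis=1)
                       / (2.0 * self.sigma ** 2))       # Gaussian neighborhood
            self.filters += self.lr * g[:, None] * (x - self.filters)

    def forward(self, image):
        # Activation map: (num_patches, num_filters) similarity scores.
        patches = extract_patches(image, self.k)
        return -np.linalg.norm(patches[:, None, :] - self.filters[None, :, :],
                               axis=2)

def hebbian_mask_update(mask, pre, post, eta=0.01):
    # Hebbian-like rule for a local mask between layers (illustrative):
    # entries grow where pre- and post-synaptic activity co-occur.
    return np.clip(mask + eta * np.outer(post, pre), 0.0, 1.0)

# Example: one unsupervised step on a random RGB image.
layer = SOMConvLayer()
img = np.random.default_rng(1).random((32, 32, 3))
layer.update(extract_patches(img, layer.k))
acts = layer.forward(img)

In the paper's full pipeline the SOM layers and the Hebbian masks interact per layer; this sketch only places the two learning rules side by side to show that neither requires gradients or labels.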

Help me understand this report
View preprint versions

Search citation statements

Order By: Relevance

Paper Sections

Select...
3
1
1

Citation Types

0
17
0

Year Published

2021
2021
2022
2022

Publication Types

Select...
5
1

Relationship

1
5

Authors

Journals

citations
Cited by 7 publications
(17 citation statements)
references
References 24 publications
0
17
0
Order By: Relevance
“…If these are provided they can potentially be used, but, typically, an unsupervised model should be in a position to function even when they are absent. Nevertheless, only a handful of approaches adhere to strict unsupervised training criteria [5], [6], [7], and the ones reporting results on the MNIST database are specifically indicated in Table 1. Frequently, in the literature, an "unsupervised" model with a supervised or self-/semi-supervised training procedure is proposed.…”
Section: Methods (citation type: mentioning)
Confidence: 99%
“…building a deep SOM and training it in a purely unsupervised way has proven to be a complex and difficult task. Only a small number of models exist that can be classified as unsupervised beyond any doubt [5], [6], and [7]. Equally few are the approaches that extend beyond the three-hidden-layer limit [8], [9], and [6].…”
Section: Introduction (citation type: mentioning)
Confidence: 99%
“…They have been well studied and their early-stage testing value is beyond doubt, but when used in isolation they might offer a partial, biased view of a model's capabilities. For instance, their grayscale characteristic (i.e., that they consist of single-channel images) conceals the fact that a substantial number of deep SOMs are not in a position to process and model colored images, whereas a handful of advanced ones like [14]-[16], [47] succeed in doing so. Nevertheless, preliminary results for the MNIST benchmark of a pilot SOCOM study can be found in [48].…”
Section: Neural Output Visualization (citation type: mentioning)
Confidence: 99%
“…Meeting both main objectives, i.e., building a deep SOM and training it in a purely unsupervised way, has proven to be a complex and difficult task. Only a small number of models exist that can be classified as unsupervised beyond any doubt [12]-[14]. Equally few are the approaches that extend beyond the three-hidden-layer limit [13], [15], [16].…”
Section: Introduction (citation type: mentioning)
Confidence: 99%
“…By using a fusion method, the classification accuracy may be increased. We evaluated the methods on real datasets that include CIFAR-10 [11]-[13] and CIFAR-100 [14]-[16]. Furthermore, we also collected a real dataset in which the samples have different resolutions and each label has a different number of samples.…”
Section: Introduction (citation type: mentioning)
Confidence: 99%