2017 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
DOI: 10.1109/icassp.2017.7952548
On random weights for texture generation in one layer CNNs

Abstract: Recent work in the literature has shown experimentally that one can use the lower layers of a trained convolutional neural network (CNN) to model natural textures. More interestingly, it has also been shown experimentally that a single layer with random filters can also model textures, although with less variability. In this paper we ask why one-layer CNNs with random filters are so effective in generating textures. We theoretically show that one-layer convolutional architectures (without a n…

Cited by 7 publications (4 citation statements)
References 8 publications
“…The prediction output of an RNN at each moment is composed of two parts: one comes from the input at the current moment, and the other from the hidden state carried over from the previous moment. At present, RNNs show great advantages in speech recognition, machine translation, text and language processing, and other fields [14,15].…”
Section: Recurrent Neural Network (mentioning, confidence: 99%)
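The two-part dependence the excerpt describes can be written as the standard recurrence h_t = tanh(W_x x_t + W_h h_{t-1} + b). A minimal numpy sketch with illustrative dimensions (the sizes and weights here are arbitrary, not from the cited works):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: input dim 3, hidden dim 4
W_x = rng.normal(size=(4, 3))   # input-to-hidden weights (current moment)
W_h = rng.normal(size=(4, 4))   # hidden-to-hidden weights (last moment)
b = np.zeros(4)

def rnn_step(x_t, h_prev):
    """One RNN step: the new state mixes the current input x_t
    with the hidden state h_prev from the previous moment."""
    return np.tanh(W_x @ x_t + W_h @ h_prev + b)

h = np.zeros(4)                       # initial hidden state
for x_t in rng.normal(size=(5, 3)):   # a sequence of 5 inputs
    h = rnn_step(x_t, h)
```

The same state `h` is threaded through every step, which is what lets the output at each moment depend on the whole history of inputs.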
“…They found that during optimization for a style transfer task, the perceptual loss with a proper weight scale can work well with untrained, randomly-weighted networks, and generate results competitive with prior work [8] that uses pretrained weights. Mongia et al [19] prove why one-layer CNNs with random weights can successfully generate textures. Randomly-weighted networks have also been employed in unsupervised learning [29] and reinforcement learning [7].…”
Section: Related Work (mentioning, confidence: 99%)
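To make the "one-layer CNN with random weights" idea concrete, here is a hypothetical numpy sketch of the general recipe (random filters, ReLU, then Gram-matrix texture statistics in the style of Gatys-type texture synthesis). It is an illustration of the technique being cited, not the exact construction analyzed in the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

def one_layer_features(img, filters):
    """Valid 2-D correlation of a grayscale image with a bank of random
    filters, followed by ReLU: a one-layer CNN with random weights."""
    k = filters.shape[-1]
    # im2col: every k x k patch of the image as a row vector
    patches = np.lib.stride_tricks.sliding_window_view(img, (k, k))
    patches = patches.reshape(-1, k * k)              # (num_patches, k*k)
    feats = patches @ filters.reshape(len(filters), -1).T
    return np.maximum(feats, 0.0)                     # (num_patches, num_filters)

def gram(feats):
    """Gram matrix of the feature maps: the second-order statistic
    matched during texture synthesis."""
    return feats.T @ feats / len(feats)

img = rng.random((16, 16))              # stand-in texture image
filters = rng.normal(size=(8, 3, 3))    # 8 random, untrained 3x3 filters
G = gram(one_layer_features(img, filters))
```

Texture synthesis then optimizes a new image so that its Gram matrix `G` matches that of the reference texture; the cited works observe this works surprisingly well even though the filters were never trained.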
“…In the field of deep learning, networks with random weights have also shown preferable properties [27,28,29]. Giryes et al [27] found that networks with random Gaussian filters can perform a distance-preserving embedding of the original data, whereas training finds hyperplanes that separate different data points.…”
Section: Introduction (mentioning, confidence: 99%)
“…Gilbert et al [28] connected networks with random Gaussian filters to compressed sensing and analyzed their invertibility, observing that such networks can achieve good classification performance. Mongia et al [29] applied random weights to static texture generation.…”
Section: Introduction (mentioning, confidence: 99%)