2022
DOI: 10.1109/access.2022.3197219
A New Pointwise Convolution in Deep Neural Networks Through Extremely Fast and Non Parametric Transforms

Abstract: Some conventional transforms such as Discrete Walsh-Hadamard Transform (DWHT) and Discrete Cosine Transform (DCT) have been widely used as feature extractors in image processing but rarely applied in neural networks. However, we found that these conventional transforms can serve as a powerful feature extractor in channel dimension without any learnable parameters in deep neural networks. This paper firstly proposes to apply conventional transforms on pointwise convolution, showing that such transforms can sign…

Cited by 2 publications (2 citation statements)
References 49 publications
“…Comparative studies of different SPI reconstruction methods. Moreover, HSI under GPU acceleration has shown that it depends on the specific application requirements, including the nature of the data, computational environment, and Deep Learning [41,42]. With its simpler arithmetic operations, the HSI often exhibits advantages in scenarios where the computation can be entirely kept within the real domain.…”
Section: Related Work
confidence: 99%
“…For training samples, it is not easy to obtain the required data set, which can cause the network to fall into a local optimal solution while searching for the best parameter set. In practical applications, it is usually necessary to determine the objective function value after multiple iterations [17][18]. When these parameters do not achieve the desired purpose, this method cannot be used.…”
Section: Convolutional Neural Network Algorithm
confidence: 99%