2019 27th European Signal Processing Conference (EUSIPCO)
DOI: 10.23919/eusipco.2019.8902831
Harmonic Networks with Limited Training Samples

Abstract: Convolutional neural networks (CNNs) are very popular nowadays for image processing. CNNs allow one to learn optimal filters in a (mostly) supervised machine learning context. However, this typically requires abundant labelled training data to estimate the filter parameters. Alternative strategies have been deployed to reduce the number of parameters and/or filters to be learned, and thus to decrease overfitting. In the context of reverting to preset filters, we propose here a computationally efficient harmonic…
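As a rough illustration of the idea outlined in the abstract, the sketch below composes a fixed bank of 2D DCT filters (the preset, non-learned part) with a learned 1x1 convolution that mixes the spectral responses. This is a minimal PyTorch sketch of the general concept only: the filter size, channel handling, and class layout are assumptions made for illustration, not the paper's implementation.

```python
# Minimal sketch of a "harmonic block": fixed 2D DCT filters followed by a
# learned 1x1 convolution. Sizes and structure are illustrative assumptions.
import math
import torch
import torch.nn as nn
import torch.nn.functional as F


def dct_filter_bank(k: int) -> torch.Tensor:
    """Return the K*K separable 2D DCT-II basis as (K*K, 1, K, K) filters."""
    n = torch.arange(k, dtype=torch.float32)
    # 1D orthonormal DCT-II basis: rows indexed by frequency u, columns by position x
    basis_1d = torch.cos(math.pi * (n[None, :] + 0.5) * n[:, None] / k)
    basis_1d[0] *= 1.0 / math.sqrt(2.0)
    basis_1d *= math.sqrt(2.0 / k)
    # Outer products of row/column bases give the 2D filters
    filters = torch.einsum('ux,vy->uvxy', basis_1d, basis_1d)
    return filters.reshape(k * k, 1, k, k)


class HarmonicBlock(nn.Module):
    """Convolution with preset DCT filters, then a learned 1x1 combination."""

    def __init__(self, in_channels: int, out_channels: int, k: int = 3):
        super().__init__()
        self.k = k
        # Fixed spectral filters, stored as a buffer so they receive no gradients
        self.register_buffer('dct', dct_filter_bank(k))
        # The only learned parameters: how to mix the K*K spectral responses
        self.combine = nn.Conv2d(in_channels * k * k, out_channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        c = x.shape[1]
        # Depthwise convolution with the fixed DCT bank: each input channel
        # is expanded into K*K spectral response maps
        spectral = F.conv2d(x, self.dct.repeat(c, 1, 1, 1),
                            padding=self.k // 2, groups=c)
        return self.combine(spectral)
```

In this sketch only the 1x1 combination carries trainable parameters; the DCT bank itself stays fixed, which is what reduces the number of parameters to be estimated from labelled data.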

Cited by 14 publications (12 citation statements)
References: 24 publications
“…This enables efficient model compression by parameter truncation with only minor degradation in the model performance. This has been shown to be particularly useful for tasks with limited training samples [6].…”
Section: Discussion
Confidence: 99%
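The compression-by-truncation idea in the statement above can be illustrated as follows: dropping the high-frequency DCT filters shrinks the learned combination layer proportionally. The triangular selection rule (keep filters with u + v < λ) and the channel sizes below are assumptions made for illustration, not the exact procedure of [6].

```python
# Sketch of truncating the DCT basis: keep only low-frequency filters and
# size the learned 1x1 combination accordingly. Selection rule is assumed.
import torch
import torch.nn as nn

def truncated_filter_indices(k: int, lam: int) -> torch.Tensor:
    """Indices of 2D DCT filters whose frequencies satisfy u + v < lam."""
    u = torch.arange(k).repeat_interleave(k)   # row frequency of each filter
    v = torch.arange(k).repeat(k)              # column frequency of each filter
    return torch.nonzero(u + v < lam).flatten()

k, lam = 3, 3                                  # 3x3 DCT basis, keep u + v < 3
keep = truncated_filter_indices(k, lam)
print(f"kept {len(keep)} of {k * k} filters")  # 6 of 9

# The learned parameters of the 1x1 combination shrink in proportion
in_ch, out_ch = 64, 64
full = nn.Conv2d(in_ch * k * k, out_ch, kernel_size=1)
small = nn.Conv2d(in_ch * len(keep), out_ch, kernel_size=1)
print(full.weight.numel(), "->", small.weight.numel())
```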
“…particular basis functions [3], [4] have mostly focused on the ability to compress the network; we show furthermore that prior information coming from a well-chosen filter basis can not only be used for compression but can also speed up training convergence and improve performance. This paper presents a comprehensive study based on our earlier works [5], [6] on harmonic blocks. In this work we introduce further analysis and substantially expand the experimental validation to include edge and object detection cases that more broadly demonstrate the benefits of harmonic networks.…”
Section: Introduction
Confidence: 99%
“…The research community is actively working to solve this issue, proposing solutions that can be adopted for tasks that involve small data or low resources. Examples of such approaches are pre-trained models (e.g., transfer learning [73]), unlabeled data (e.g., semi-supervised learning [19]), custom training paradigms, or novel blocks for specific architectures [76]. The first category includes transfer learning techniques and few-shot learning.…”
Section: Image Classification
Confidence: 99%
“…Ulicny and Dahyot [20] used a CNN to classify images in the DCT domain. Ulicny et al. [21, 22] designed the so-called harmonic blocks to replace the traditional convolutions. These blocks generate features by learning combinations of spectral filters defined by the DCT.…”
Section: Related Work
Confidence: 99%
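As a loose illustration of "classifying images in the DCT domain" mentioned in the statement above, the classifier input is a 2D DCT of the image (or of its blocks) rather than raw pixels. The whole-image variant and the 8x8 block size below are illustrative choices, not taken from [20].

```python
# Sketch of producing a DCT-domain input representation for a classifier.
# Block size and the whole-image variant are illustrative assumptions.
import numpy as np
from scipy.fft import dctn

rng = np.random.default_rng(0)
image = rng.random((32, 32)).astype(np.float32)       # stand-in grayscale image

# Whole-image 2D DCT (type-II, orthonormal): one possible spectral input
spectral = dctn(image, type=2, norm='ortho')

# JPEG-style alternative: independent 8x8 block DCTs
blocks = image.reshape(4, 8, 4, 8).transpose(0, 2, 1, 3)   # (4, 4, 8, 8) blocks
block_dct = dctn(blocks, type=2, norm='ortho', axes=(-2, -1))
print(spectral.shape, block_dct.shape)
```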