2020
DOI: 10.1109/access.2020.2983774
Sequence-Dropout Block for Reducing Overfitting Problem in Image Classification

Abstract: Overfitting is a common problem in computer vision applications. It arises when training convolutional neural networks and is caused by a lack of training data or by network complexity. The novel sequence-dropout (SD) method is proposed in this paper to alleviate overfitting when training networks. The SD method works by dropping out units (channels of feature maps) from the network in a sequence, replacing the traditional operation of random omission. Sophisticated aggregation strategies are use…
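The abstract only states that channels are dropped "in a sequence" rather than at random; the exact scheme is not given here. The following is a hypothetical sketch of one way such sequential channel dropout could look, not the paper's implementation; the class name `SequenceChannelDropout` and the cyclic-order policy are assumptions.

```python
# Hypothetical sketch (not the paper's implementation): drop feature-map
# channels in a fixed cyclic order instead of sampling them at random.
import torch
import torch.nn as nn

class SequenceChannelDropout(nn.Module):
    """Drop a contiguous block of channels, advancing the start index each
    forward pass so that every channel is dropped in turn."""
    def __init__(self, num_channels: int, drop_ratio: float = 0.2):
        super().__init__()
        self.num_channels = num_channels
        self.drop_count = max(1, int(num_channels * drop_ratio))
        self.start = 0  # current position in the drop sequence

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        if not self.training:
            return x  # like standard dropout, inactive at inference
        idx = [(self.start + i) % self.num_channels for i in range(self.drop_count)]
        mask = torch.ones(self.num_channels, device=x.device)
        mask[idx] = 0.0
        self.start = (self.start + self.drop_count) % self.num_channels
        # Rescale kept channels so the expected activation magnitude is preserved.
        keep_prob = 1.0 - self.drop_count / self.num_channels
        return x * mask.view(1, -1, 1, 1) / keep_prob
```

The module would be placed after a convolutional block, e.g. `SequenceChannelDropout(num_channels=64)` on a tensor of shape (N, 64, H, W); the deterministic drop order is the assumed contrast with ordinary random channel dropout.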

Cited by 34 publications (12 citation statements)
References 30 publications
“…The advantages also include efficiency enhancement by hyperparameter per round (of practice). On the other hand, small patches of the previous network caused the improper number of patches, failing to gather good data patches for transfer to the next layers [30,35].…”
Section: Discussion
confidence: 99%
“…Dropout layers were added after each activation function [30]. The reasons were twofold: to act as a regularizer to prevent overfitting and to measure the model uncertainty.…”
Section: Methods
confidence: 99%
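Using dropout both as a regularizer and as an uncertainty measure is commonly realized as Monte Carlo dropout, where dropout stays active at test time and predictions are averaged over several stochastic passes. The sketch below is a minimal illustration under that assumption; the model, layer sizes, and function name `mc_predict` are not from the cited paper.

```python
# Minimal Monte Carlo dropout sketch (illustrative, not the cited paper's code):
# keep dropout active at inference and average several stochastic forward passes
# to obtain a mean prediction and an uncertainty estimate.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(128, 64),
    nn.ReLU(),
    nn.Dropout(p=0.5),   # dropout placed after the activation, as described above
    nn.Linear(64, 10),
)

def mc_predict(model: nn.Module, x: torch.Tensor, passes: int = 20):
    model.train()  # keep dropout layers active during inference
    with torch.no_grad():
        outputs = torch.stack([model(x).softmax(dim=-1) for _ in range(passes)])
    model.eval()
    # Mean over passes is the prediction; the standard deviation is a simple
    # proxy for model uncertainty.
    return outputs.mean(dim=0), outputs.std(dim=0)

mean_probs, uncertainty = mc_predict(model, torch.randn(4, 128))
```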
“…We have added regularization techniques in the proposed architecture like dropout, ridge regression, and batch normalization. Dropout helps [20] to prevent overfitting by randomly dropping out (setting to zero) some features (usually hidden units) during training. This forces the model to learn multiple independent representations of the same data, which reduces overfitting and improves generalization.…”
Section: Convolution and Pooling Filters
confidence: 99%
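The statement above lists dropout, ridge regression (an L2 weight penalty), and batch normalization as the regularizers used. The sketch below shows how those three typically appear together in a small convolutional block; the layer sizes and 32x32 input assumption are illustrative, not the cited architecture.

```python
# Illustrative sketch of the three regularizers mentioned above: batch
# normalization and dropout inside the network, and the ridge (L2) penalty
# applied through the optimizer's weight_decay term.
# Layer sizes are assumptions, not the cited paper's architecture.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Conv2d(3, 32, kernel_size=3, padding=1),
    nn.BatchNorm2d(32),           # batch normalization
    nn.ReLU(),
    nn.Dropout2d(p=0.25),         # randomly zeroes whole feature maps during training
    nn.Flatten(),
    nn.Linear(32 * 32 * 32, 10),  # assumes 32x32 input images
)

# weight_decay adds the L2 (ridge) penalty to the loss during optimization.
optimizer = torch.optim.SGD(model.parameters(), lr=0.01, weight_decay=1e-4)
```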