2022 4th International Conference on Smart Systems and Inventive Technology (ICSSIT) 2022
DOI: 10.1109/icssit53264.2022.9716555
Image Enhancement and Classification of CIFAR-10 Using Convolutional Neural Networks

Cited by 7 publications (3 citation statements)
References 12 publications
“…This implies that the network learns the filters that, in other systems, are traditionally hand-designed. Because of this lower reliance on prior knowledge, CNNs have a major advantage over other approaches, since designing hand-engineered features is difficult [13], [14], [28].…”
Section: Methods
confidence: 99%
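The excerpt above contrasts learned filters with hand-engineered ones. A minimal sketch of the underlying operation follows: a CNN layer applies the same 2D cross-correlation shown here, except that the nine kernel weights are learned by gradient descent instead of being fixed by hand. The Sobel kernel below stands in for a hand-engineered filter; all names are illustrative.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid-mode 2D cross-correlation, the core op a conv layer applies."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# A hand-engineered vertical-edge filter (Sobel). In a CNN, these nine
# weights would instead be learned from data.
sobel_x = np.array([[-1, 0, 1],
                    [-2, 0, 2],
                    [-1, 0, 1]], dtype=float)

image = np.zeros((8, 8))
image[:, 4:] = 1.0                 # vertical edge at column 4
response = conv2d(image, sobel_x)
print(response.shape)              # (6, 6)
print(response.max())              # strong response at the edge: 4.0
```

An 8x8 input with a 3x3 kernel yields a 6x6 valid-mode output; the filter responds only where its window straddles the edge.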
See 1 more Smart Citation
“…It implies that in other systems, which are traditionally manually designed, the network knows about filters. Because of the lower restriction on prior knowledge, CNN has a major benefit over others, and designing hand-engineered characteristics is difficult [13], [14], [28].…”
Section: Methodsmentioning
confidence: 99%
“…Compact binary codes greatly reduce the storage required while supporting efficient computation [8], [9]. Recently, many hashing methods [10]–[13] have achieved impressive performance.…”
Section: Introduction
confidence: 99%
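The storage benefit of compact codes mentioned above can be illustrated with one classical baseline, random-projection locality-sensitive hashing. This is a generic stand-in for the learned hashing methods cited as [10]–[13], not any specific one of them; all function names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def lsh_hash(vectors, n_bits, rng):
    """Map real-valued feature vectors to compact binary codes via random
    hyperplane projections (classical LSH). Similar vectors tend to fall
    on the same side of most hyperplanes, so they get similar codes."""
    dim = vectors.shape[1]
    planes = rng.standard_normal((dim, n_bits))
    return (vectors @ planes > 0).astype(np.uint8)

def hamming(a, b):
    """Number of differing bits between two binary codes."""
    return int(np.count_nonzero(a != b))

# A 512-d float feature (2 KB) shrinks to 32 bits, while near-duplicates
# keep a small Hamming distance and unrelated vectors differ in roughly
# half of the bits in expectation.
x = rng.standard_normal(512)
x_noisy = x + 0.05 * rng.standard_normal(512)   # near-duplicate of x
y = rng.standard_normal(512)                    # unrelated vector

codes = lsh_hash(np.stack([x, x_noisy, y]), 32, rng)
print(hamming(codes[0], codes[1]))  # small
print(hamming(codes[0], codes[2]))  # much larger
```

The learned methods in the excerpt replace the random hyperplanes with projections trained to preserve semantic similarity, but the storage argument is the same.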
“…Han et al. [16] employed a pruning technique to reduce the size of networks such as AlexNet [17][20] by 9x without compromising image classification accuracy. Similarly, Li et al. [21] used filter pruning to lower the inference cost of DNN models such as VGG-16 [22] and residual network (ResNet)-110 [23] on the CIFAR-10 [24] image classification dataset. Despite its benefits, pruning a DNN architecture can cause accuracy loss and adds complexity during model training.…”
Section: Introduction
confidence: 99%
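The filter pruning described above can be sketched as magnitude-based (L1-norm) filter selection: rank each convolutional filter by the L1 norm of its weights and keep only the strongest ones. This is a minimal illustration under that assumption, not the exact procedure of [16] or [21]; the shapes and names are illustrative.

```python
import numpy as np

def prune_filters(weights, keep_ratio):
    """Rank conv filters by L1 norm and keep the strongest fraction.
    `weights` has shape (n_filters, in_channels, kh, kw); returns the
    pruned weight tensor and the sorted indices of the kept filters."""
    norms = np.abs(weights).reshape(weights.shape[0], -1).sum(axis=1)
    n_keep = max(1, int(round(weights.shape[0] * keep_ratio)))
    keep = np.sort(np.argsort(norms)[::-1][:n_keep])
    return weights[keep], keep

rng = np.random.default_rng(0)
layer = rng.standard_normal((64, 3, 3, 3))        # a VGG-style conv layer
pruned, kept = prune_filters(layer, keep_ratio=0.5)
print(layer.shape, "->", pruned.shape)            # (64, 3, 3, 3) -> (32, 3, 3, 3)
```

Note that removing a filter also removes one output channel, so the next layer's input channels must be pruned to match; bookkeeping like this is part of the training complexity the excerpt warns about.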