2023
DOI: 10.3390/s23031325

Less Is More: Adaptive Trainable Gradient Dropout for Deep Neural Networks

Abstract: The undeniable computational power of artificial neural networks has granted the scientific community the ability to exploit available data in ways previously inconceivable. However, deep neural networks require an overwhelming quantity of data in order to learn the underlying connections within it and thus complete the specific task they have been assigned. Feeding a deep neural network vast amounts of data usually ensures efficiency, but may harm the networ…
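The abstract is cut off before the method itself is described. Purely as a hypothetical illustration of the general gradient-dropout idea suggested by the title (randomly masking gradient entries during backpropagation, rather than masking activations), here is a NumPy sketch; the function name masked_grad and the fixed drop_rate are assumptions, and the paper's adaptive, trainable drop rate is not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(0)

def masked_grad(grad, drop_rate=0.2):
    """Generic gradient dropout: zero a random fraction of the
    gradient entries during backprop. drop_rate is a placeholder;
    the paper makes this rate adaptive and trainable, which is
    not shown here."""
    keep = (rng.random(grad.shape) >= drop_rate).astype(grad.dtype)
    return grad * keep

# Toy linear-regression step updated with a randomly masked gradient
W = rng.normal(size=(3, 2))
x = rng.normal(size=(4, 3))
y = rng.normal(size=(4, 2))

pred = x @ W
grad_W = x.T @ (pred - y) / len(x)  # dL/dW for mean squared error (factor 2 omitted)
W -= 0.1 * masked_grad(grad_W)      # some gradient entries contribute nothing this step
```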


Cited by 3 publications (1 citation statement)
References 24 publications
Citation classifications: 0 supporting, 1 mentioning, 0 contrasting
“…Dropout layers [79] are a powerful and intuitive regularization technique. Dropout operates on the principle of randomly omitting units and their connections within the neural network during the training phase.…”
Section: Dropout Layer (mentioning)
confidence: 99%
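The quoted statement describes classical unit dropout [79], where whole units and their connections are randomly omitted during training, as opposed to the gradient dropout this paper proposes. As a minimal illustrative sketch of that classical mechanism in inverted-dropout form (the function name, the NumPy implementation, and the default p=0.5 are assumptions, not from either paper):

```python
import numpy as np

def dropout_forward(x, p=0.5, training=True, rng=None):
    """Inverted dropout: randomly zero units during training.

    p is the probability of *dropping* a unit; surviving activations
    are rescaled by 1/(1-p) so the expected output is unchanged and
    no rescaling is needed at inference time."""
    if not training or p == 0.0:
        return x  # at test time all units are kept
    rng = rng or np.random.default_rng()
    mask = (rng.random(x.shape) >= p).astype(x.dtype)
    return x * mask / (1.0 - p)

# Example: roughly half of the activations are zeroed each call
h = np.ones((2, 4))
print(dropout_forward(h, p=0.5))
```

Because of the inverted 1/(1-p) scaling, the expected activation matches the no-dropout case, so the forward pass at inference is just the identity; the cited paper, by contrast, applies an adaptive, trainable dropout to gradients during backpropagation.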