2023
DOI: 10.1016/j.neucom.2023.126327
Deep neural networks in the cloud: Review, applications, challenges and research directions

Cited by 32 publications (3 citation statements); References 201 publications.
“…Artificial neural networks are computational models inspired by biological neural networks that consist of a series of interconnected simple processing elements called neurons or nodes [24]. As is well known, one of the main advantages of neural networks lies in their ability to represent both linear and non-linear models by learning directly from data measurements [25].…”
Section: Materials and Methods (mentioning, confidence: 99%)
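The quoted description of a neural network as interconnected processing elements that learn both linear and non-linear mappings directly from data can be illustrated with a minimal sketch. This is an assumed setup, not code from the cited paper: a small scikit-learn multilayer perceptron fitted to a non-linear target (y = sin(x)).

```python
# Minimal sketch (assumed, not from the cited paper): a small feed-forward
# network of interconnected "neurons" learning a non-linear mapping
# directly from data measurements, as described in the quoted passage.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(500, 1))   # input measurements
y = np.sin(X).ravel()                       # non-linear target

# Two hidden layers of simple processing elements (nodes) with tanh activations.
model = MLPRegressor(hidden_layer_sizes=(32, 32), activation="tanh",
                     max_iter=5000, random_state=0)
model.fit(X, y)

print("approximation at x=1.0:", model.predict([[1.0]])[0])  # close to sin(1.0) ~ 0.84
```

With a linear target the same model reduces to fitting an affine map, which is the sense in which such networks cover both linear and non-linear regimes.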
“…Deep neural networks require a very large number of samples and computational resources in order to train the model [180,181]. In many fields, we do not have access to a large collection of samples that could be used in the learning process.…”
Section: Deep Transfer Learning (mentioning, confidence: 99%)
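The transfer-learning remedy this citing passage points to, reusing a network trained on a large source dataset when target samples are scarce, can be sketched as follows. This is a hedged illustration with hypothetical parameters, assuming a recent PyTorch/torchvision install, not the reviewed paper's code.

```python
# Hedged sketch (assumed setup): deep transfer learning with few labeled samples.
# A network pretrained on a large dataset is reused; its feature extractor is
# frozen and only a small task-specific head is retrained on the target data.
import torch
import torch.nn as nn
from torchvision import models

num_target_classes = 5  # hypothetical small target task

# Load a backbone pretrained on ImageNet (the large source dataset).
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

# Freeze the pretrained feature extractor so it is not updated.
for param in backbone.parameters():
    param.requires_grad = False

# Replace the classification head with a new layer sized for the target task.
backbone.fc = nn.Linear(backbone.fc.in_features, num_target_classes)

# Only the new head's parameters are optimized during fine-tuning.
optimizer = torch.optim.Adam(backbone.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

# One illustrative training step on a dummy mini-batch of target samples.
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, num_target_classes, (8,))
optimizer.zero_grad()
loss = criterion(backbone(images), labels)
loss.backward()
optimizer.step()
```

Because only the small head is trained, the number of target samples and the compute budget needed are far lower than training the full network from scratch, which is the point the citing passage makes.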
“…Labeled audio data can be acquired through human annotation, but this is costly, especially for problems with idiosyncratic or unusual sound classes where existing data has little utility [5]. For such specialized classification tasks, new labeled data must be collected specifically for that problem, increasing the per-task annotation cost since the data has minimal reuse value across tasks.…”
Section: Introduction (mentioning, confidence: 99%)