2014
DOI: 10.1109/tnnls.2013.2293637
On the Complexity of Neural Network Classifiers: A Comparison Between Shallow and Deep Architectures

Abstract: Recently, researchers in the artificial neural network field have focused their attention on connectionist models composed of several hidden layers. In fact, experimental results and heuristic considerations suggest that deep architectures are more suitable than shallow ones for modern applications that face very complex problems, e.g., vision and human language understanding. However, the theoretical results supporting such a claim are still few and incomplete. In this paper, we propose a new approach to…

Cited by 453 publications (267 citation statements)
References 35 publications
“…It was noted in [6] that the VC dimension proposed for Neural Networks is also applicable to Deep Neural Networks. It was shown in [2] that for neural nets with sigmoidal activation function, the VC dimension is loosely upper-bounded by O(w^4), where w is the number of free parameters in the network.…”
Section: VC Dimension of Deep Neural Network and Classification Accuracy
Confidence: 99%
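As a rough illustration of how that bound scales, the sketch below counts the free parameters w of a small sigmoidal multilayer perceptron and evaluates w^4. This is a minimal sketch: the layer sizes and the helper name count_free_parameters are hypothetical, and O(w^4) is an asymptotic order, so the hidden constant is ignored.

def count_free_parameters(layer_sizes):
    """Weights plus biases between consecutive fully connected layers."""
    return sum(n_in * n_out + n_out
               for n_in, n_out in zip(layer_sizes, layer_sizes[1:]))

# Hypothetical architecture: 784 inputs, two hidden layers, 10 outputs.
w = count_free_parameters([784, 128, 64, 10])
vc_order = w ** 4  # order of the loose O(w^4) bound; constants omitted

print(f"free parameters w = {w}")   # 109386
print(f"w^4 ~ {vc_order:.2e}")      # shows how fast the loose bound grows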
“…RobERtmimics human recognition of spectroscopic features by using a pre-trained, deep-belief neural network (Hinton 2006(Hinton , 2007Bengio et al 2007;Le Roux & Bengio 2010;Montavon et al 2012;Bianchini & Scarselli 2014) at its core. DBNs are multi-layer, nonlinear transformations of the input data, the emission spectrum in this case, where each consecutive layer presents a progressively higher level of abstraction of the underlying features in the spectrum.…”
Section: Robertmentioning
confidence: 99%
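To make the "multi-layer, nonlinear transformations" concrete, here is a minimal NumPy sketch of a DBN-style forward pass, assuming sigmoid units. The layer widths, the random weights (standing in for pre-trained RBM weights), and the placeholder spectrum are all illustrative assumptions, not the actual RobERt network.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)

# Hypothetical widths: spectrum bins -> progressively more abstract features.
layer_sizes = [500, 200, 50, 10]
weights = [rng.normal(0.0, 0.1, size=(m, n))  # stand-ins for pre-trained RBM weights
           for n, m in zip(layer_sizes, layer_sizes[1:])]
biases = [np.zeros(m) for m in layer_sizes[1:]]

h = rng.random(layer_sizes[0])                # placeholder emission spectrum
for W, b in zip(weights, biases):
    h = sigmoid(W @ h + b)                    # each layer re-encodes the previous one

print(h.shape)  # (10,): the most abstract representation of the input spectrum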
“…For some algorithms, a formal time complexity analysis has been performed [23, 8, 4, 18]. This analysis can be used when choosing an algorithm for a prediction task.…”
Section: Time Complexity of ML Algorithms
Confidence: 99%