1992
DOI: 10.1007/bf00993256
A reply to Honavar's book review of Neural Network Design and the Complexity of Learning

Abstract: I want to take this opportunity to reinforce a point that is perhaps not stated clearly enough in my book. It is a point that Honavar (and another reviewer) have rightfully pursued. The NP-completeness is with respect to the size of the network and the amount of data in the training set to be loaded. Honavar points out that this measure of complexity of the input is somewhat suspect because "it does not reflect the … inherent regularities of the data items being loaded; nor can it capture possible effects of c…

Cited by 11 publications (16 citation statements) | References 0 publications
“…Second, the time required for the network to learn the required connection weights may become unrealistically long (Judd, 1990).…”
Section: Scalability
confidence: 99%
“…The work in this paper is inspired by Judd (1990) who shows the following problem to be NP-complete: "Given a neural network and a set of training examples, does there exist a set of edge weights for the network so that the network produces the correct output for all the training examples?" Judd shows that the problem remains NP-complete even if the network is only required to produce the correct output for two-thirds of the training examples, which implies that even approximately training a neural network is intrinsically difficult in the worst case (Judd, 1988).…”
Section: Previous Work
confidence: 99%
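The loading problem quoted above can be made concrete with a small sketch. The toy network, the discrete weight set, and all function names below are my own illustration, not Judd's construction: an exhaustive search over candidate weight vectors for a fixed threshold network, whose candidate count grows exponentially in the number of weights — the kind of worst-case blow-up the NP-completeness result says cannot, in general, be avoided.

```python
# Illustrative sketch of the "loading problem": given a fixed network and
# training examples, does a weight assignment exist that fits them all?
# Brute force over a discrete weight set {-1, 0, 1}; the search space is
# |W|^(number of weights), i.e. 3^9 = 19683 candidates even for this toy net.
from itertools import product

def step(x):
    # Hard threshold unit: fires iff its net input is non-negative.
    return 1 if x >= 0 else 0

def forward(w, x):
    # Tiny fixed architecture: 2 inputs -> 2 threshold hidden units -> 1 output.
    # w = (w1a, w1b, b1, w2a, w2b, b2, v1, v2, bo)
    h1 = step(w[0] * x[0] + w[1] * x[1] + w[2])
    h2 = step(w[3] * x[0] + w[4] * x[1] + w[5])
    return step(w[6] * h1 + w[7] * h2 + w[8])

def loadable(examples, weight_set=(-1, 0, 1)):
    # Exhaustive search: return True iff some weight vector fits every example.
    for w in product(weight_set, repeat=9):
        if all(forward(w, x) == y for x, y in examples):
            return True
    return False

# XOR is loadable on this 2-hidden-unit network, even with weights in {-1, 0, 1}.
xor = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
print(loadable(xor))  # -> True
```

The point of the sketch is the scaling, not this particular instance: doubling the number of weights squares the candidate count, and Judd's result shows that no algorithm can escape this worst-case behavior for arbitrary architectures and training sets (assuming P ≠ NP).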
“…in [15,78]. Moreover, the results presented in this section apply only to a restricted set of SNN models and, apart from the programmability of transmission delays of synaptic connections, they do not cover all the capabilities of SNNs that could result from computational units based on firing times.…”
Section: Complexity Results Versus Real-world Performance
confidence: 99%
“…• Calculability: NNs' computational power outperforms a Turing machine [154] • Complexity: The "loading problem" is NP-complete [15,78] • Capacity: MLP, RBF and WNN 1 are universal approximators [35,45,63] • Regularization theory [132]; PAC-learning 2 [171]; Statistical learning theory, VC-dimension, SVM 3 [174] Nevertheless, traditional neural networks suffer from intrinsic limitations, mainly for processing large amounts of data or for fast adaptation to a changing environment. Several characteristics, such as iterative learning algorithms or artificially designed neuron models and network architectures, are strongly restrictive compared with biological processing in natural neural networks.…”
Section: Traditional Neural Network
confidence: 99%