2006
DOI: 10.1117/12.658916

A heuristic neural network initialization scheme for modeling nonlinear functions in engineering mechanics

Cited by 4 publications (6 citation statements)
References 16 publications
“…The two options for prototypes shown in Fig. 2 for the approximation of a fractional power nonlinearity have been presented in Reference [26]. There appears to be a many-to-many relationship between types of nonlinearities and prototypes, which should be confirmed in future studies.…”
Section: Multiple Options in Selecting Prototypes and Variants
confidence: 91%
“…All ten types of nonlinearities were successfully trained using the proposed methodology. While typical examples of directly adopting the proposed prototypes can be found in References [25,26,27], the way to utilize the proposed decomposition technique is the focus of this section. All training was carried out using the Matlab Neural Network Toolbox [19] with batch training mode and the Levenberg-Marquardt backpropagation algorithm [28].…”
Section: Training Examples
confidence: 99%
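The excerpt above describes training the network prototypes with the Levenberg-Marquardt backpropagation algorithm in batch mode. A minimal sketch of that training scheme, assuming a one-hidden-layer tanh network fitted to a fractional-power nonlinearity with a finite-difference Jacobian (the network size, data, and damping schedule are illustrative assumptions, not the cited paper's implementation):

```python
import numpy as np

# Illustrative target: a fractional-power nonlinearity y = x**0.5,
# as in the prototype example quoted above.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
y = x ** 0.5

H = 6  # hidden units (an assumption, not taken from the paper)

def unpack(p):
    # Parameter vector layout: input weights, hidden biases,
    # output weights, output bias.
    return p[:H], p[H:2 * H], p[2 * H:3 * H], p[3 * H]

def predict(p, x):
    w1, b1, w2, b2 = unpack(p)
    h = np.tanh(np.outer(x, w1) + b1)  # hidden activations, shape (N, H)
    return h @ w2 + b2

def residuals(p):
    return predict(p, x) - y

def jacobian(p, eps=1e-6):
    # Finite-difference Jacobian of the residual vector w.r.t. parameters.
    r0 = residuals(p)
    J = np.empty((r0.size, p.size))
    for j in range(p.size):
        q = p.copy()
        q[j] += eps
        J[:, j] = (residuals(q) - r0) / eps
    return J

p = rng.normal(scale=0.5, size=3 * H + 1)
mu = 1e-2  # Levenberg-Marquardt damping factor
for _ in range(200):
    r = residuals(p)
    J = jacobian(p)
    A = J.T @ J + mu * np.eye(p.size)  # damped Gauss-Newton system
    p_new = p - np.linalg.solve(A, J.T @ r)
    if np.sum(residuals(p_new) ** 2) < np.sum(r ** 2):
        p, mu = p_new, max(mu * 0.5, 1e-8)  # accept step, relax damping
    else:
        mu *= 2.0                           # reject step, increase damping

rmse = np.sqrt(np.mean(residuals(p) ** 2))
```

Because each step is accepted only when it reduces the sum of squared residuals, the fit error decreases monotonically, mirroring the batch-mode training the excerpt attributes to the Matlab Neural Network Toolbox.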
“…Commonly used methods in data mining comprise:
- Artificial NNs: NNs are biologically inspired predictive models (mimicking the functioning of the human brain) that can learn and map linear and nonlinear functions [14];
- Generalized rule induction: generates rules about significant relationships, rather than just predicting a class value [5];
- Top-down induction of decision trees: induces classification rules in the intermediate form of a tree structure [15];
- k-nearest neighbor: classifies unlabeled data instances with the majority class of the k most similar data instances in the training set [6];
- Other methods comprise genetic algorithms [16] used for the optimization of data mining algorithms, rough sets, fuzzy set approaches for classification [5], Bayesian classification [5], etc. …”
Section: Knowledge Discovery and Data Mining
confidence: 99%
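The k-nearest-neighbor rule quoted above (majority class among the k most similar training instances) can be sketched directly; the data and choice of k below are illustrative assumptions:

```python
import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, x, k=3):
    # Euclidean distance from the query point to every training instance.
    dists = np.linalg.norm(X_train - x, axis=1)
    # Indices of the k closest training instances.
    nearest = np.argsort(dists)[:k]
    # Majority vote over their class labels.
    return Counter(y_train[nearest]).most_common(1)[0][0]

# Toy two-class training set (illustrative, not from the cited work).
X_train = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 0.8]])
y_train = np.array([0, 0, 1, 1])

label = knn_predict(X_train, y_train, np.array([0.95, 0.9]), k=3)
```

Here the query point's two closest neighbors carry class 1 and the third carries class 0, so the majority vote assigns class 1.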