2001
DOI: 10.1016/s0045-7825(01)00248-1

Neural-network-based reliability analysis: a comparative study

Cited by 232 publications (82 citation statements)
References 11 publications
“…Hurtado and Alvarez [28] tested both and concluded that RBF networks have the following desirable features: high training speed, very small error, accuracy in probability estimation and robustness with respect to changes in model parameters, training sample size and generation procedures. In a subsequent work [1], the same authors analyse, compare and classify not only network types, cost functions, optimization algorithms and sampling methods, but also the different purposes for which ANNs are used. They also recommend procedures for applying ANNs in structural reliability calculations.…”
Section: Types of the ANNs
confidence: 99%
“…Therefore, g(x) = 0 is an n-dimensional surface that divides the domain into the safety region (g(x) > 0) and the failure region (g(x) < 0) (see Figure 1). Available methods for reliability assessment can be categorized into two main groups: gradient-based and simulation-based methods [1]. The first group consists of iterative minimization procedures based on estimating the gradient of the limit state function in order to find the design point, a point on the failure surface with the highest probability density, also denoted as the most likely failure point.…”
Section: Introduction
confidence: 99%
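The simulation-based branch described in this excerpt is easiest to see in a crude Monte Carlo run. The sketch below is a minimal illustration, not taken from the cited paper: the `limit_state` function, the two standard-normal basic variables and the sample size are all assumed for the example.

```python
import numpy as np
from scipy.stats import norm

def limit_state(x):
    """Hypothetical limit state g(x): g > 0 is the safe region, g < 0 is failure.
    A stand-in for the (usually expensive) structural response model."""
    return 3.0 - x[:, 0] - x[:, 1]

rng = np.random.default_rng(seed=0)
n_samples = 200_000
x = rng.standard_normal((n_samples, 2))   # two standard-normal basic variables (assumed)

g = limit_state(x)
pf = np.mean(g < 0.0)                     # crude Monte Carlo estimate of P[g(X) < 0]
beta = -norm.ppf(pf)                      # corresponding reliability index

print(f"Estimated failure probability: {pf:.4e} (reliability index ~ {beta:.2f})")
```

Gradient-based methods such as FORM instead search for the design point directly, which is why they need the gradient information this excerpt mentions.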
“…In addition to the approaches in which neural networks are applied directly to reliability problems, they are also applied within meta- or surrogate models when the analyses based on physical models are computationally expensive [26], [20], [45], so that the neural networks learn to generalize the input-output mapping from a few samples computed with the physical models. Furthermore, neural networks can also be applied in combination with statistical methods, e.g.…”
Section: Introduction
confidence: 99%
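As a rough illustration of the surrogate idea in this excerpt, the following sketch trains a small neural network on a handful of runs of a hypothetical `expensive_model` and then uses it inside plain Monte Carlo sampling. The use of scikit-learn's `MLPRegressor`, the model itself, the design of experiments and the network size are assumptions for illustration, not details from the cited works.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def expensive_model(x):
    """Hypothetical physical model response; in practice a costly FE analysis."""
    return 3.0 - x[:, 0] - x[:, 1] + 0.1 * x[:, 0] * x[:, 1]

rng = np.random.default_rng(seed=1)

# Small design of experiments: only a few runs of the costly model
x_train = rng.uniform(-4.0, 4.0, size=(60, 2))
g_train = expensive_model(x_train)

# Neural-network surrogate of the input-output mapping
surrogate = MLPRegressor(hidden_layer_sizes=(20, 20), max_iter=5000, random_state=0)
surrogate.fit(x_train, g_train)

# The cheap surrogate replaces the physical model inside plain Monte Carlo
x_mc = rng.standard_normal((200_000, 2))
pf_hat = np.mean(surrogate.predict(x_mc) < 0.0)
print(f"Surrogate-based failure probability estimate: {pf_hat:.4e}")
```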
“…The above-mentioned response surface methods differ in the choice of the interpolation polynomial and of the evaluation points, and it has been shown [14] that the obtained results are highly dependent on these choices and on the shape of the actual failure hypersurface. According to Hurtado and Alvarez [15], this is due to the rigid and non-adaptive structure of the models implemented by response surface methods. Moreover, it is noted that global approximations are often constructed for parameter spaces that ignore constraints imposed by the physical nature of the problem.…”
Section: Introduction
confidence: 99%
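For reference, a minimal sketch of the response surface idea discussed above: a second-order polynomial is fitted by least squares to a few evaluations of a hypothetical limit state and then substituted for it in the reliability calculation. The `limit_state` function, the evaluation points and the polynomial basis are assumptions for illustration only; as the excerpt notes, the cited methods differ precisely in how these are chosen.

```python
import numpy as np
from itertools import combinations_with_replacement

def limit_state(x):
    """Hypothetical limit state; stands in for the true failure hypersurface."""
    return 3.0 - x[:, 0] - x[:, 1] + 0.1 * x[:, 0] ** 2

def quad_features(x):
    """Second-order polynomial basis: 1, x_i and the products x_i * x_j."""
    cols = [np.ones(len(x))]
    cols += [x[:, i] for i in range(x.shape[1])]
    cols += [x[:, i] * x[:, j]
             for i, j in combinations_with_replacement(range(x.shape[1]), 2)]
    return np.column_stack(cols)

rng = np.random.default_rng(seed=2)
x_pts = rng.uniform(-3.0, 3.0, size=(15, 2))   # evaluation points; their choice is method-specific
coef, *_ = np.linalg.lstsq(quad_features(x_pts), limit_state(x_pts), rcond=None)

# The fitted polynomial now replaces g(x) in the reliability calculation
x_mc = rng.standard_normal((200_000, 2))
pf_rs = np.mean(quad_features(x_mc) @ coef < 0.0)
print(f"Response-surface failure probability estimate: {pf_rs:.4e}")
```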
“…Hyperplane approximation methods show a tendency to link importance sampling methods and response surfaces, but are limited by the necessity to find evaluation points that lie on the failure hypersurface and by gradient computations. Regarding neural networks, [15,22] report that radial basis functions perform better than backpropagation multilayer perceptrons, since the former can better represent local information. However, artificial neural networks need a properly chosen training set.…”
Section: Introduction
confidence: 99%
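A minimal sketch of a radial basis function network in this setting, assuming Gaussian basis functions centred at the training points and output weights obtained by linear least squares. The `limit_state` function, the kernel `width` and the training set are hypothetical and, as the excerpt stresses, the quality of the approximation hinges on how that training set is chosen.

```python
import numpy as np

def limit_state(x):
    """Hypothetical limit state used to generate the training targets."""
    return 3.0 - x[:, 0] - x[:, 1] + 0.1 * x[:, 0] ** 2

def rbf_design(x, centers, width):
    """Gaussian RBF activations: one localized basis function per centre."""
    d2 = np.sum((x[:, None, :] - centers[None, :, :]) ** 2, axis=-1)
    return np.exp(-d2 / (2.0 * width ** 2))

rng = np.random.default_rng(seed=3)
centers = rng.uniform(-3.0, 3.0, size=(40, 2))   # training points double as RBF centres
targets = limit_state(centers)
width = 1.5                                      # kernel width (assumed; needs tuning)

# Output-layer weights by linear least squares (no backpropagation required)
weights, *_ = np.linalg.lstsq(rbf_design(centers, centers, width), targets, rcond=None)

# Evaluate the trained RBF surrogate inside a Monte Carlo run
x_mc = rng.standard_normal((100_000, 2))
g_hat = rbf_design(x_mc, centers, width) @ weights
print(f"RBF-network failure probability estimate: {np.mean(g_hat < 0.0):.4e}")
```

The locality of the Gaussian basis functions is one plausible reading of why [15,22] found RBF networks better at representing local information than backpropagation multilayer perceptrons.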