2014
DOI: 10.1016/j.asoc.2014.04.012

Robust feedforward and recurrent neural network based dynamic weighted combination models for software reliability prediction

Cited by 52 publications (28 citation statements)
References: 24 publications
“…That is why many researchers have focused on non‐PSRGMs. These models generally use different machine learning approaches such as neural networks and support vector machines. Neural network models have better robustness, but the convergence of the training process is slow and can easily fall into local minima, so the best solution cannot be guaranteed.…”
Section: Related Work
confidence: 99%
“…Besides, the performance of dynamic neural networks is better than that of static neural networks, and dynamic networks can effectively extract the dynamic characteristics of systems [26]. Lately, RNNs have drawn much attention for capturing the dynamic features of systems [27], [28], [29], [30]. Because of their powerful characteristics, RNNs have been used efficiently for a wide range of problems such as time series prediction [31], [32], [33].…”
Section: Related Work
confidence: 99%
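Since the excerpt above mentions RNNs applied to time-series prediction, the following minimal sketch shows one way a simple recurrent network could be fit to a failure-count series. The window length, layer sizes, training settings, and the sample failure counts are illustrative assumptions, not details taken from the cited works.

```python
# Minimal sketch: a simple recurrent network for next-step prediction of a
# software failure-count series. All data and hyperparameters are hypothetical.
import numpy as np
import tensorflow as tf

def make_windows(series, window=4):
    """Turn a 1-D series into (samples, window, 1) inputs and next-step targets."""
    X, y = [], []
    for i in range(len(series) - window):
        X.append(series[i:i + window])
        y.append(series[i + window])
    return np.array(X)[..., None].astype("float32"), np.array(y, dtype="float32")

# Hypothetical cumulative failure counts per testing interval.
failures = np.array([2, 5, 9, 12, 16, 21, 24, 28, 31, 33, 36, 38], dtype="float32")
X, y = make_windows(failures / failures.max())  # scale to [0, 1]

model = tf.keras.Sequential([
    tf.keras.layers.SimpleRNN(8, input_shape=(X.shape[1], 1)),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=200, verbose=0)

next_scaled = model.predict(X[-1:], verbose=0)[0, 0]
print("Predicted next cumulative failure count:", next_scaled * failures.max())
```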
“…In [1], a robust ANN based on the median neuron model was proposed. Moreover, a robust feedforward and recurrent neural network model was presented in [17].…”
Section: Introduction
confidence: 99%