1995
DOI: 10.1080/07350015.1995.10524579
Genetic Algorithms for Estimation Problems With Multiple Optima, Nondifferentiability, and Other Irregular Features

Cited by 121 publications (29 citation statements)
References 20 publications
“…Instead, they are calibrated jointly to fit a set of data moments of equal size. This fit is very nonlinear in the parameters; so a genetic algorithm following Dorsey and Mayer (1995) is used to find the best fit. It turns out to be useful to calibrate c_e and c_f as the ratio c_e/c_f and the level c_f.…”
Section: Calibration
confidence: 99%
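The excerpt above recommends calibrating the pair (c_e, c_f) through the ratio c_e/c_f and the level c_f rather than the two levels directly. A minimal sketch of a moment-matching objective under that reparameterization, suitable as the fitness function of a genetic algorithm — the model behind `simulate_moments` is not specified in the excerpt, so a stand-in is used here purely for illustration:

```python
import numpy as np

def moment_distance(params, simulate_moments, data_moments):
    """Squared distance between simulated and data moments for a candidate
    parameter vector. The vector carries (c_e/c_f, c_f) rather than
    (c_e, c_f) -- the reparameterization the excerpt recommends.
    `simulate_moments` stands in for the (unspecified) economic model."""
    ratio, c_f = params
    c_e = ratio * c_f              # recover the level c_e from ratio and level
    sim = simulate_moments(c_e, c_f)
    diff = np.asarray(sim, dtype=float) - np.asarray(data_moments, dtype=float)
    return float(diff @ diff)

# Toy stand-in model: two moments computed directly from (c_e, c_f).
sim = lambda c_e, c_f: [c_e + c_f, c_e * c_f]
target = sim(1.0, 2.0)             # moments at c_e = 1, c_f = 2 (ratio 0.5)
print(moment_distance((0.5, 2.0), sim, target))  # → 0.0 at the truth
```

A GA would then search over (ratio, level) pairs, minimizing `moment_distance`; the reparameterization can make the fitness surface easier to search when the two costs move together.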
“…[11] Equation (5) describes the surface SP anomaly induced by a pumping test at rate Q in an unconfined homogeneous aquifer of hydraulic conductivity K, electrokinetic coupling parameter L, and bedrock depth Z. Therefore, knowing the hydraulic head h_0 in the vicinity of the well (at r_0) and the surface SP anomaly during a pumping test of flow rate Q, we should be able to determine Z, K, and L. We used a genetic algorithm [Dorsey and Mayer, 1995] to find the best {Z, K, L} that minimizes a weighted quadratic error function f between the predicted and observed SP data.…”
Section: Inversion of SP Data
confidence: 99%
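The inversion described above reduces to minimizing a weighted quadratic misfit over the triple {Z, K, L}. A sketch of such an objective, assuming the forward SP model of the excerpt's Equation (5) is available as a callable (it is not reproduced here, so a toy linear stand-in is used for the check):

```python
import numpy as np

def weighted_misfit(params, forward_model, r_obs, sp_obs, sigma):
    """Weighted quadratic error f between predicted and observed SP data.

    params        : candidate parameter triple, e.g. (Z, K, L)
    forward_model : callable returning predicted SP at radii r_obs --
                    the aquifer model of Equation (5) is assumed, not shown
    sigma         : per-datum standard deviations used as weights
    """
    predicted = forward_model(params, r_obs)
    residuals = (predicted - sp_obs) / sigma
    return float(np.sum(residuals ** 2))

# Toy check with a stand-in linear "forward model" (purely illustrative).
toy_model = lambda p, r: p[0] + p[1] * r
r = np.array([1.0, 2.0, 3.0])
obs = toy_model((0.5, -0.2), r)
print(weighted_misfit((0.5, -0.2), toy_model, r, obs, np.ones(3)))  # → 0.0
```

A genetic algorithm then searches parameter space for the triple minimizing this misfit, which is useful here because the forward model makes the objective nonlinear in the parameters.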
“…As a first step, the MGA procedure was used to train the NN on each of the 30 datasets. The training began after setting the parameters to values recommended by Dorsey and Mayer (1995) and , and after including the additional modifications. After training, the network's performance (as a function of error rates) was compared to a benchmark NN — the categorical learning network, or CATLRN, identified in earlier studies as a reliable performer (see Etheridge and Sriram, 1997).…”
Section: Data, Experiment, and the Results
confidence: 99%
“…The GA is a global search method that searches from one population of solutions to another while comparing the newly generated solution to the best solution obtained from the earlier search. It works well when finding solutions to functions with complex nonlinear relationships (Dorsey, Johnson, & Mayer, 1994; Dorsey, Johnson, & Van Boening, 1994; Dorsey & Mayer, 1995; Sexton, Alidaee, Dorsey, & Johnson, 1998; Sexton & Dorsey, 2000). A researcher can use the GA to improve the performance of a backpropagation NN or even as an alternative to backpropagation (Schaffer, Whitley, & Eshelman, 1992).…”
Section: The Modified Genetic Algorithm (MGA)
confidence: 99%
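The population-to-population search with retention of the best solution that this excerpt describes can be sketched as a minimal real-coded genetic algorithm with elitism. This is a generic illustration of the scheme, not Dorsey and Mayer's exact operators; population size, rates, and the test function are arbitrary choices:

```python
import random

def genetic_minimize(objective, bounds, pop_size=20, generations=100,
                     crossover_rate=0.9, mutation_rate=0.1, seed=0):
    """Minimal real-coded GA: moves from one population to the next while
    always carrying forward the best solution found so far (elitism)."""
    rng = random.Random(seed)
    dim = len(bounds)

    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    best = min(pop, key=objective)

    for _ in range(generations):
        scores = [objective(ind) for ind in pop]

        def select():
            # Tournament selection: fitter of two random individuals wins.
            i, j = rng.randrange(pop_size), rng.randrange(pop_size)
            return pop[i] if scores[i] < scores[j] else pop[j]

        new_pop = [list(best)]          # elitism: keep the best-so-far
        while len(new_pop) < pop_size:
            a, b = select(), select()
            if rng.random() < crossover_rate:
                # Arithmetic crossover: random blend of the two parents.
                w = rng.random()
                child = [w * x + (1 - w) * y for x, y in zip(a, b)]
            else:
                child = list(a)
            for k in range(dim):
                # Mutation: reset a coordinate uniformly within its bounds.
                if rng.random() < mutation_rate:
                    child[k] = rng.uniform(*bounds[k])
            new_pop.append(child)
        pop = new_pop
        cand = min(pop, key=objective)
        if objective(cand) < objective(best):
            best = cand                 # compare new generation to best-so-far
    return best

# Example: a nondifferentiable, multimodal-style objective (minimum at (1, -2)).
f = lambda p: abs(p[0] - 1.0) + abs(p[1] + 2.0)
best = genetic_minimize(f, bounds=[(-5, 5), (-5, 5)])
print(best)
```

Because the search uses only objective values, not gradients, it handles exactly the irregular features named in the article's title: multiple optima and nondifferentiability.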