1993
DOI: 10.1137/0614023

Optimization by Direct Search in Matrix Computations

Abstract (truncated): …of algorithms it is shown that direct search is capable of revealing instability or poor performance, even when such failure is difficult to discover using theoretical analysis or numerical tests with random or nonrandom data. Informative numerical examples generated by direct search provide the impetus for further analysis and improvement of an algorithm. The direct search methods used are the method of alternating directions and the multi-directional search method of Dennis and Torczon. The problems exami…
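As a rough illustration of the approach described in the abstract, the sketch below implements the simplest of the direct search methods named there: cycle through coordinate ("alternating") directions, accept any improving step, and shrink the step when no direction improves the objective. The objective used here (relative error of naive recursive summation against math.fsum) and the step schedule are illustrative assumptions, not the paper's actual test problems.

```python
# Minimal sketch of the "alternating directions" idea: maximize f by cycling
# through coordinate directions, accepting improving steps and refining the
# step size otherwise. The objective is an illustrative stand-in.
import math

def relative_summation_error(x):
    """Relative error of naive left-to-right summation of x."""
    naive = 0.0
    for xi in x:
        naive += xi
    exact = math.fsum(x)
    return abs(naive - exact) / max(abs(exact), 1e-300)

def alternating_directions_max(f, x0, step=1.0, shrink=0.5, iters=200):
    """Maximize f by cyclic coordinate search with step-size reduction."""
    x, fx = list(x0), f(x0)
    for _ in range(iters):
        improved = False
        for i in range(len(x)):        # cycle through the coordinate directions
            for s in (step, -step):    # probe +step and -step along direction i
                trial = list(x)
                trial[i] += s
                ft = f(trial)
                if ft > fx:            # keep any improving step
                    x, fx, improved = trial, ft, True
                    break
        if not improved:
            step *= shrink             # no direction improved: refine the step
    return x, fx

if __name__ == "__main__":
    x_best, err = alternating_directions_max(relative_summation_error,
                                             [1.0, 1e16, -1e16, 1.0])
    print("point found:", x_best, "relative error:", err)
```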


Cited by 51 publications (34 citation statements)
References 46 publications
“…LISREL 8.7 was used to run SEM. It used an iterative maximum likelihood algorithm to calculate path coefficients and to achieve the best match between the covariance matrix reproduced by the model and the observed variance-covariance structure from the data (Higham, 1993; Jöreskog and Sörbom, 1996). Derived from the logarithmic expression of a likelihood ratio test, the maximum likelihood (ML) discrepancy function was introduced (Jöreskog, 1967) to indicate the fit of the model.…”
Section: SEM (mentioning)
confidence: 99%
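For reference, the ML discrepancy function referred to in this statement takes the standard form below (notation assumed here: S is the sample covariance matrix, Σ(θ) the model-implied covariance matrix, and p the number of observed variables):

F_{\mathrm{ML}}(\theta) = \ln\lvert\Sigma(\theta)\rvert + \operatorname{tr}\!\bigl(S\,\Sigma(\theta)^{-1}\bigr) - \ln\lvert S\rvert - p

Minimizing this function over the free parameters θ yields the maximum likelihood estimates, and its minimized value indicates how well the model reproduces the observed covariance structure.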
“…Another example arises in the tuning of algorithmic parameters for a given method/code (see [6] for instances where DSM have been applied to solve such problems) -the resulting objective functions are likely to exhibit all sorts of discontinuities given the way that typically a method/code responds to changes in its parameters. DSM have also been used for automatic error analysis [12,13], a process in which the computer is used to analyze the accuracy or stability of a numerical computation (and examples have been provided where the objective function is discontinuous). Many engineering design problems (which are likely to form the core of the mainstream applications of derivative-free optimization) lead to objective functions involving discontinuities and limited or no access to derivatives (one such application in aircraft design which was recently drawn to our attention is reported in [3]).…”
Section: Introduction (mentioning)
confidence: 99%
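A minimal sketch of the "automatic error analysis" use mentioned in this statement: let an off-the-shelf derivative-free optimizer search for inputs on which a routine loses accuracy. Nelder-Mead is used below only as a convenient stand-in for the direct search methods discussed in [12,13], and the objective (error of the one-pass variance formula against a stable reference) is an illustrative assumption.

```python
# Sketch of automatic error analysis: a derivative-free search hunts for
# inputs on which a numerically fragile formula behaves badly.
import numpy as np
from scipy.optimize import minimize

def naive_variance(x):
    # one-pass textbook formula E[x^2] - (E[x])^2 / n, prone to cancellation
    n = len(x)
    return (np.sum(x**2) - np.sum(x)**2 / n) / (n - 1)

def stable_variance(x):
    return np.var(x, ddof=1)  # numerically stable reference

def negative_relative_error(x):
    x = np.asarray(x, dtype=float)
    ref = stable_variance(x)
    err = abs(naive_variance(x) - ref) / max(abs(ref), 1e-300)
    return -err  # minimizing the negative maximizes the exposed error

x0 = np.array([1e8, 1e8 + 1.0, 1e8 + 2.0, 1e8 + 3.0])
res = minimize(negative_relative_error, x0, method="Nelder-Mead")
print("relative error exposed:", -res.fun)
```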
“…In some experiments on direct search in Higham (1993) we used Matlab 3.5, and we found it much easier to generate counterexamples to rcond (examples in which rcond provides an estimate that is much too large) than we do now with Matlab 4.2. It seems that the maximizations in Higham (1993) were not only defeating the algorithm underlying rcond, but also, unbeknown to us, exploiting a bug in the implementation of the function. The conclusions of Higham (1993) are unaffected, however.…”
Section: Matlab's Rcond (mentioning)
confidence: 94%
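In the spirit of the counterexample search described in this statement, the sketch below uses a direct search (again Nelder-Mead for brevity) to look for matrices on which a condition-number estimate is far too optimistic. The toy_condition_estimate function is a deliberately crude, hypothetical estimator introduced only for illustration; it is not MATLAB's rcond, whose algorithm is not reproduced here.

```python
# Sketch of a counterexample search: maximize the ratio of the true 1-norm
# condition number to a crude estimate, over the entries of a small matrix.
import numpy as np
from scipy.optimize import minimize

N = 4  # small matrices keep the search space manageable

def toy_condition_estimate(A):
    """Crude estimate of kappa_1(A): ||A||_1 * ||A^{-1} e||_1 / ||e||_1."""
    e = np.ones(N)
    y = np.linalg.solve(A, e)
    return np.linalg.norm(A, 1) * np.linalg.norm(y, 1) / np.linalg.norm(e, 1)

def negative_underestimation_ratio(v):
    A = v.reshape(N, N)
    try:
        true_kappa = np.linalg.cond(A, 1)
        est_kappa = toy_condition_estimate(A)
    except np.linalg.LinAlgError:
        return 0.0  # singular trial point: neutral value, keep searching
    return -true_kappa / max(est_kappa, 1e-300)

rng = np.random.default_rng(0)
v0 = np.eye(N).ravel() + 0.1 * rng.standard_normal(N * N)
res = minimize(negative_underestimation_ratio, v0, method="Nelder-Mead",
               options={"maxiter": 5000})
print("estimator too optimistic by a factor of about", -res.fun)
```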