1997
DOI: 10.1002/(sici)1099-0887(199712)13:12<977::aid-cnm113>3.0.co;2-x

Sequential function approximation for the solution of differential equations

Abstract: A computational method for the solution of differential equations is proposed. With this method an accurate approximation is built by incremental additions of optimal local basis functions. The parallel direct search software package (PDS), which supports parallel objective function evaluations, is used to solve the associated optimization problem efficiently. The advantage of the method is that, although it resembles adaptive methods in computational mechanics, an a priori grid is not necessary. Moreover, th…
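
As a rough illustration of the idea in the abstract, the sketch below builds a solution of a simple 1-D boundary value problem by adding one local basis function at a time, each fitted by a derivative-free optimizer. Everything here is an assumption for illustration only: the Gaussian-bump basis, the finite-difference derivatives, the test problem u'' = -π² sin(πx) with homogeneous Dirichlet conditions, and SciPy's Nelder-Mead standing in for the PDS direct-search package used in the paper.

```python
# Illustrative sketch of sequential function approximation (SFA) for a 1-D BVP:
#   u''(x) = f(x),  u(0) = u(1) = 0
# One local basis function is added per step; its parameters are found by a
# derivative-free (direct-search) optimizer, standing in here for PDS.
import numpy as np
from scipy.optimize import minimize

f = lambda x: -np.pi**2 * np.sin(np.pi * x)   # forcing term; exact solution is sin(pi x)
xs = np.linspace(0.0, 1.0, 201)               # collocation points on [0, 1]

def phi(x, c, w):
    """Local Gaussian bump times x(1-x), so each term satisfies u(0) = u(1) = 0."""
    w = abs(w) + 1e-6
    return np.exp(-((x - c) / w) ** 2) * x * (1.0 - x)

def phi_xx(x, c, w):
    """Second derivative of phi, approximated by central finite differences."""
    h = 1e-4
    return (phi(x + h, c, w) - 2.0 * phi(x, c, w) + phi(x - h, c, w)) / h**2

terms = []  # accumulated basis terms: (amplitude, center, width)

def residual_norm(p):
    """Discrete least-squares residual of u'' - f with one candidate term added."""
    a, c, w = p
    uxx = sum(ai * phi_xx(xs, ci, wi) for ai, ci, wi in terms)
    r = uxx + a * phi_xx(xs, c, w) - f(xs)
    return np.dot(r, r)

for step in range(8):  # incremental additions of one local basis function at a time
    best = min(
        (minimize(residual_norm, x0=[1.0, c0, 0.2], method="Nelder-Mead")
         for c0 in (0.25, 0.5, 0.75)),  # a few restarts instead of a true parallel search
        key=lambda res: res.fun,
    )
    terms.append(tuple(best.x))
    print(f"step {step + 1}: residual norm^2 = {best.fun:.3e}")
```

Each pass keeps the previously fitted terms fixed and only optimizes the parameters of the newly added term, which is what makes the construction sequential and grid-free.
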

Cited by 7 publications (3 citation statements) · References 16 publications

“…The method of weighted residuals is a possible alternative, as discussed in Meade et al (1997). For example, minimizing ⟨R, R⟩, where R is the residual of a differential equation and ⟨·, ·⟩ is some appropriate inner product, corresponds to the well-known least-squares weighted residual method.…”
Section: Challenges and Opportunities · citation type: mentioning · confidence: 99%
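
For context, the ⟨R, R⟩ minimization mentioned in the excerpt is the standard least-squares weighted-residual statement; the form below is textbook notation, assumed here rather than quoted from either paper.

```latex
% Least-squares weighted-residual formulation (standard textbook form):
% for a differential equation L u = f on a domain \Omega, choose the
% approximation u_N that minimizes the squared residual norm.
\[
  R(u_N) = L u_N - f, \qquad
  \min_{u_N} \langle R(u_N),\, R(u_N) \rangle
  = \min_{u_N} \int_{\Omega} \left( L u_N - f \right)^2 \, d\Omega .
\]
```
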
“…Motivated by similarities observed in artificial neural network algorithms and adaptive grid optimization in computational mechanics, Meade et al (1997) formulated the concept of sequential function approximation (SFA) for the solution of differential equations. Using the closely related methods of weighted residuals and variational principles, appropriate optimization problems were formulated, and incrementally built solutions were obtained for one- and two-dimensional boundary value problems involving linear self-adjoint differential operators associated with homogeneous Dirichlet boundary conditions.…”
Section: Introduction · citation type: mentioning · confidence: 99%
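
The excerpt also mentions variational principles; for a linear self-adjoint, positive-definite operator with homogeneous Dirichlet conditions, the equivalent optimization problem is the classical energy-functional minimization sketched below (standard form, not taken from the cited paper).

```latex
% Variational (Rayleigh-Ritz) counterpart: for self-adjoint, positive-definite L,
% solving L u = f with u = 0 on the boundary is equivalent to minimizing
\[
  J(u) = \tfrac{1}{2} \langle L u,\, u \rangle - \langle f,\, u \rangle ,
  \qquad u = 0 \ \text{on } \partial\Omega .
\]
```
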
“…Generally, function approximation is performed on the final function with methods such as GLOMAP [3], Sequential Function Approximation (SFA) [4], [5], least-squares, etc. Other methods have been developed recently that actually learn using a least-squares approximation.…”
Section: Introduction · citation type: mentioning · confidence: 99%