2002
DOI: 10.1023/a:1013662124879

Approximating Networks and Extended Ritz Method for the Solution of Functional Optimization Problems

Abstract: Functional optimization problems can be solved analytically only if special assumptions are verified; otherwise, approximations are needed. The approximate method that we propose is based on two steps. First, the decision functions are constrained to take on the structure of linear combinations of basis functions containing free parameters to be optimized (hence, this step can be considered as an extension to the Ritz method, for which fixed basis functions are used). Then, the functional optimization problem …
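The two-step scheme described in the abstract can be illustrated with a minimal sketch (a hypothetical toy functional, not taken from the paper): the decision function is parameterized as a linear combination of tanh basis functions whose inner parameters are also free, and the resulting finite-dimensional problem is solved by plain gradient descent with numerical gradients.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 8  # number of parameterized basis functions

# Illustrative target for the toy functional J[u] = mean_x (u(x) - g(x))^2,
# discretized on sample points (assumed example, not from the paper).
def g(x):
    return np.sin(2 * np.pi * x)

# Variable-basis approximator: u(x) = sum_i c[i] * tanh(a[i]*x + b[i]).
# The inner parameters a, b are free, unlike in the classical Ritz method,
# where the basis functions are fixed and only the coefficients c vary.
def u(theta, x):
    a, b, c = theta[:n], theta[n:2 * n], theta[2 * n:]
    return np.tanh(np.outer(x, a) + b) @ c

def J(theta, x):
    return np.mean((u(theta, x) - g(x)) ** 2)

# Central-difference gradient of the reduced finite-dimensional problem.
def num_grad(theta, x, eps=1e-6):
    grad = np.empty_like(theta)
    for i in range(theta.size):
        d = np.zeros_like(theta)
        d[i] = eps
        grad[i] = (J(theta + d, x) - J(theta - d, x)) / (2 * eps)
    return grad

x = np.linspace(0.0, 1.0, 64)
theta = rng.normal(scale=0.5, size=3 * n)
initial_loss = J(theta, x)
best_loss = initial_loss
for _ in range(300):
    theta -= 0.2 * num_grad(theta, x)
    best_loss = min(best_loss, J(theta, x))
```

Once the decision function is constrained to this parameterized family, the infinite-dimensional problem reduces to ordinary nonlinear programming over the finite vector theta.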

Cited by 128 publications (129 citation statements)
References 29 publications
“…In the last decades, complex optimization problems of this kind have been approximately solved by searching suboptimal solutions over admissible sets of functions computable by neural networks [4], [21], [22], [25], [28], [29]. Neural networks can be studied in a more general context of variable-basis functions, which also include other nonlinear families of functions such as free-node splines and trigonometric polynomials with free frequencies [17].…”
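The variable- versus fixed-basis distinction drawn in the excerpt above can be made concrete with a small sketch (an illustrative example, not from the cited works): a target sinusoid whose frequency falls off the integer grid is fit poorly by fixed integer-frequency sines, while a single sinusoid with a free, optimized frequency matches it.

```python
import numpy as np

x = np.linspace(0.0, 1.0, 128)
target = np.sin(2 * np.pi * 3.3 * x)  # frequency off the integer grid

# Fixed basis: sines with integer frequencies 1..n, fit by linear least squares.
def fixed_basis_error(n):
    B = np.column_stack([np.sin(2 * np.pi * k * x) for k in range(1, n + 1)])
    coef, *_ = np.linalg.lstsq(B, target, rcond=None)
    return np.max(np.abs(B @ coef - target))

# Variable basis: one sinusoid whose frequency is itself a free parameter;
# a crude grid search stands in for the nonlinear optimization step.
def free_basis_error():
    best = np.inf
    for w in np.linspace(2.5, 4.0, 301):
        b = np.sin(2 * np.pi * w * x)
        c = (b @ target) / (b @ b)
        best = min(best, np.max(np.abs(c * b - target)))
    return best

fixed_err = fixed_basis_error(5)
free_err = free_basis_error()
```

Free-node splines and free-frequency trigonometric polynomials follow the same pattern: the nonlinear inner parameters buy accuracy that fixed bases can only reach with many more terms.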
“…For example, when optimization is performed over linear combinations of fixed-basis functions, the number of basis functions required to guarantee a desired optimization accuracy may grow exponentially fast with the number of variables of admissible solutions [23, pp. 232-233], [29]. However, experience has shown that neural networks with a small number of computational units may perform well in optimization tasks where admissible solutions depend on a large number of variables [21], [22], [25], [28], [29].…”
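The exponential growth mentioned in the excerpt above can be seen from a simple count (illustrative grid sizes assumed): a tensor-product fixed basis with m one-dimensional functions per coordinate needs m**d terms in d dimensions, whereas the parameter count of a one-hidden-layer network with n units grows only linearly in d.

```python
# Fixed tensor-product basis: m one-dimensional basis functions per
# coordinate gives m**d basis functions in d dimensions.
def fixed_basis_size(m, d):
    return m ** d

# One-hidden-layer network with n tanh units in d dimensions:
# n*(d + 1) inner weights and biases plus n outer coefficients.
def network_param_count(n, d):
    return n * (d + 2)

fixed_sizes = [fixed_basis_size(10, d) for d in (1, 2, 5, 10)]
net_sizes = [network_param_count(10, d) for d in (1, 2, 5, 10)]
```

With m = 10 nodes per axis the fixed basis already needs 10**10 terms at d = 10, while the network with 10 units uses 120 parameters; whether those few parameters suffice is exactly the question the cited works address.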
“…The operating policy, defined as a nonlinear approximating network, is directly conditioned on observations of exogenous information, which cannot be accurately modeled and would produce detrimental effects on the performance of an operating policy conditioned on approximate model's outputs (Formentin et al, 2012). The selected policy parameterization strongly influences the selection of the optimization approach, as the number of parameters necessary to obtain a good approximation for the unknown optimal control policy grows with the increasing dimension of the policy's argument (Zoppoli et al, 2002). Since the optimization of the policy parameters requires searching high dimensional spaces that map to stochastic and multimodal objective function values, global optimization methods such as evolutionary algorithms are preferred to gradient-based methods (Heidrich-Meisner and Igel, 2008).…”
Section: Assessment Of The Operational Value Of Virtual Snow Indexes
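The gradient-free search over policy parameters described in the excerpt above can be sketched as a simple evolution strategy (a toy stochastic, multimodal objective stands in for the water-system simulators used in the cited works; everything here is an assumed illustration).

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy objective: noisy evaluation of a parameterized policy. Multimodal in
# each coordinate (sin^2 term) with a mild regularizing pull toward zero.
def evaluate(theta):
    noise = rng.normal(scale=0.05)
    return -np.sum(np.sin(3 * theta) ** 2 + 0.1 * theta ** 2) + noise

# Simple (mu, lambda)-style evolution strategy: sample a population around
# the current mean, keep the elite, recombine, and shrink the step size.
def evolve(dim=4, pop=40, elite=8, gens=60, sigma=0.5):
    mean = rng.normal(size=dim)
    for _ in range(gens):
        candidates = mean + sigma * rng.normal(size=(pop, dim))
        scores = np.array([evaluate(t) for t in candidates])
        best = candidates[np.argsort(scores)[-elite:]]
        mean = best.mean(axis=0)
        sigma *= 0.97  # gradual step-size decay
    return mean, evaluate(mean)

best_theta, best_score = evolve()
```

Because each evaluation is noisy and the landscape has many local optima, population-based averaging is more robust here than following a single noisy gradient estimate, which is the rationale the excerpt attributes to Heidrich-Meisner and Igel (2008).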
“…Any subset of an algebra is called a subalgebra if it is itself an algebra. Note that a closed interval in R, such as the one to which the Weierstrass theorem refers, is of course a compact set:…”
Section: C(I) For Some Nonempty Compact Interval I ⊂ R Then For Ev…
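The compactness hypothesis in the excerpt above is what makes uniform polynomial approximation work. A quick numerical illustration (assumed target function, using Chebyshev least-squares fits as the polynomial approximants):

```python
import numpy as np

# A continuous function on the compact interval [-1, 1].
x = np.linspace(-1.0, 1.0, 400)
f = np.exp(np.sin(2 * x))

# Sup-norm error of a degree-`deg` Chebyshev polynomial fit: by the
# Weierstrass theorem this can be driven arbitrarily small on a compact set.
def sup_error(deg):
    p = np.polynomial.Chebyshev.fit(x, f, deg)
    return np.max(np.abs(p(x) - f))

errors = [sup_error(d) for d in (2, 5, 10)]
```

On a non-compact domain such as all of R no such uniform guarantee holds, which is why the compactness remark in the excerpt matters for the Stone–Weierstrass-type argument.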