2019
DOI: 10.1007/978-3-030-16841-4_1

On the Trade-Off Between Number of Examples and Precision of Supervision in Regression

Cited by 8 publications (2 citation statements)
References 4 publications
“…This possibility is particularly important when there are upper bounds on the available total computational time. It is worth mentioning that these arguments have been proved in [48] for the related problem of regression, whose investigation can be considered a preliminary step for the analysis of surrogate optimization (indeed, it involves only the construction of the approximating function, not its subsequent optimization). Possible choices for the iterative optimization algorithm examined in this paper are: a) the Method of Moving Asymptotes (MMA) [33] and its Globally Convergent upgrade (GCMMA) [34].…”
Section: Appendix C: Iterative Optimization Algorithms
confidence: 92%
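
The cited passage names the Method of Moving Asymptotes (MMA) and its Globally Convergent upgrade (GCMMA) as candidate iterative optimizers. As a minimal sketch of what calling such a solver looks like in practice, the snippet below uses the NLopt library, whose LD_MMA algorithm implements a globally convergent MMA variant; the Rosenbrock objective, bounds, and tolerances are illustrative assumptions, not details taken from the cited papers.

```python
# Minimal sketch (assumption: NLopt's LD_MMA, a globally convergent
# variant of Svanberg's Method of Moving Asymptotes).
import numpy as np
import nlopt

def objective(x, grad):
    # Rosenbrock function; NLopt fills `grad` in place when a
    # gradient-based algorithm such as MMA is used.
    if grad.size > 0:
        grad[0] = -2.0 * (1.0 - x[0]) - 400.0 * x[0] * (x[1] - x[0] ** 2)
        grad[1] = 200.0 * (x[1] - x[0] ** 2)
    return (1.0 - x[0]) ** 2 + 100.0 * (x[1] - x[0] ** 2) ** 2

opt = nlopt.opt(nlopt.LD_MMA, 2)   # 2 decision variables
opt.set_min_objective(objective)
opt.set_lower_bounds([-2.0, -2.0])
opt.set_upper_bounds([2.0, 2.0])
opt.set_xtol_rel(1e-8)             # illustrative stopping tolerance
x_opt = opt.optimize(np.array([-1.0, 1.0]))
print("minimizer:", x_opt, "value:", opt.last_optimum_value())
```

A relative-tolerance stopping rule like `set_xtol_rel` is one simple way to respect the kind of upper bound on total computational budget the passage mentions; a wall-clock limit (`set_maxtime`) would serve the same purpose.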
“…In that context, Quasi-Monte Carlo sequences are often preferred to realizations of Monte Carlo ones, because the former are typically able to cover the domain Ω in a more uniform way than the latter (see, e.g., [17] for an illustrative comparison). In other words, points generated from Quasi-Monte Carlo sequences […] It is worth mentioning that these arguments have been proved in [48] for the related problem of regression, whose investigation can be considered a preliminary step for the analysis of surrogate optimization (indeed, it involves only the construction of the approximating function, not its subsequent optimization). Possible choices for the iterative optimization algorithm examined in this paper are: a) the Method of Moving Asymptotes (MMA) [33] and its Globally Convergent upgrade (GCMMA) [34].…”
Section: Appendix B: Surrogate Optimization
confidence: 96%
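
The passage contrasts Quasi-Monte Carlo sequences with plain Monte Carlo sampling in terms of how uniformly they cover the domain Ω. A small sketch of that comparison, assuming SciPy's scipy.stats.qmc module (a scrambled Sobol sequence against i.i.d. uniform draws, measured by the centered L2-discrepancy, where lower means more uniform coverage):

```python
# Minimal sketch (assumption: SciPy's qmc module as the Quasi-Monte
# Carlo generator; the unit square stands in for the domain Ω).
import numpy as np
from scipy.stats import qmc

n_points = 2 ** 7  # 128 points in [0, 1]^2

# Quasi-Monte Carlo: scrambled Sobol sequence.
sobol = qmc.Sobol(d=2, scramble=True, seed=0)
qmc_sample = sobol.random_base2(m=7)

# Plain Monte Carlo: i.i.d. uniform samples.
rng = np.random.default_rng(0)
mc_sample = rng.random((n_points, 2))

print("Sobol discrepancy:      ", qmc.discrepancy(qmc_sample))
print("Monte Carlo discrepancy:", qmc.discrepancy(mc_sample))
# The Sobol discrepancy is typically noticeably smaller, i.e. the
# quasi-random points cover the domain more uniformly.
```

This mirrors the comparison the citing paper alludes to: for the same number of points, a low-discrepancy sequence usually leaves smaller gaps in the domain than independent random draws, which is why it is often preferred when building the approximating function for surrogate optimization.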