2015
DOI: 10.48550/arxiv.1502.02846
Preprint

Probabilistic Line Searches for Stochastic Optimization

Abstract: In deterministic optimization, line searches are a standard tool ensuring stability and efficiency. Where only stochastic gradients are available, no direct equivalent has so far been formulated, because uncertain gradients do not allow for a strict sequence of decisions collapsing the search space. We construct a probabilistic line search by combining the structure of existing deterministic methods with notions from Bayesian optimization. Our method retains a Gaussian process surrogate of the univariate optim…
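The core idea in the abstract, a Gaussian process surrogate over the univariate objective along the current search direction, can be illustrated in a few lines. The sketch below is not the authors' implementation: it fits a plain RBF-kernel GP to noisy function values only (the full method also incorporates gradient observations and probabilistic Wolfe conditions), and the kernel, noise level, and candidate step grid are assumptions made here for illustration.

```python
import numpy as np

# Minimal illustrative sketch: fit a 1-D Gaussian process to noisy
# evaluations of the objective along a search direction, then pick the
# step size with the lowest posterior mean.

def rbf_kernel(a, b, length_scale=0.3, variance=1.0):
    # Squared-exponential kernel between 1-D input arrays a and b.
    d = a[:, None] - b[None, :]
    return variance * np.exp(-0.5 * (d / length_scale) ** 2)

def gp_posterior_mean(t_obs, y_obs, t_query, noise_var=0.01):
    # Standard GP regression posterior mean at the query step sizes.
    K = rbf_kernel(t_obs, t_obs) + noise_var * np.eye(len(t_obs))
    Ks = rbf_kernel(t_query, t_obs)
    return Ks @ np.linalg.solve(K, y_obs)

# Toy noisy 1-D objective along the search direction (an assumption,
# standing in for stochastic loss evaluations).
rng = np.random.default_rng(0)
f = lambda t: (t - 0.6) ** 2
t_obs = np.array([0.0, 0.3, 1.0])
y_obs = f(t_obs) + 0.05 * rng.standard_normal(t_obs.shape)

t_grid = np.linspace(0.0, 1.5, 200)
mu = gp_posterior_mean(t_obs, y_obs, t_grid)
step = t_grid[np.argmin(mu)]  # accepted step size along the direction
print(f"selected step size: {step:.3f}")
```

In the paper itself the accept/reject decision is driven by a probabilistic version of the Wolfe conditions rather than by minimizing the posterior mean; the grid search above is a simplification.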

Cited by 1 publication (1 citation statement)
References 14 publications
“…[22] suggests a line search technique on the component function f_{i_k}(x) selected in each iteration to estimate a step size for SAG. [13] suggests performing a line search on an estimated function, evaluated by a Gaussian process with samples f_{i_t}(x_t). [14] suggests generating the step sizes by a given function with an unknown parameter, and using online SGD to update this unknown parameter.…”
Section: Introduction
confidence: 99%
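For concreteness, here is a hedged sketch of the first approach mentioned in the excerpt: a backtracking (Armijo-type) line search run on the single component function sampled at the current iteration. The toy quadratic component, the Armijo constants, and the stopping rule are assumptions for illustration, not details taken from [22].

```python
import numpy as np

# Sketch: pick a step size by backtracking line search on the sampled
# component function f_i alone, as a stand-in for the per-iteration
# line search described in the excerpt.

def backtracking_on_component(f_i, grad_i, x, eta0=1.0, beta=0.5, c=1e-4, max_iter=20):
    g = grad_i(x)
    eta = eta0
    for _ in range(max_iter):
        # Armijo sufficient-decrease test on the sampled component only.
        if f_i(x - eta * g) <= f_i(x) - c * eta * np.dot(g, g):
            return eta
        eta *= beta  # shrink the step and try again
    return eta

# Toy component: f_i(x) = 0.5 * ||A x - b||^2 for a randomly drawn (A, b).
rng = np.random.default_rng(1)
A, b = rng.standard_normal((3, 2)), rng.standard_normal(3)
f_i = lambda x: 0.5 * np.sum((A @ x - b) ** 2)
grad_i = lambda x: A.T @ (A @ x - b)

x = np.zeros(2)
eta = backtracking_on_component(f_i, grad_i, x)
print(f"step size from component line search: {eta:.4f}")
```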