2019
DOI: 10.48550/arxiv.1904.12952
Preprint

New optimization algorithms for neural network training using operator splitting techniques

Cited by 1 publication (13 citation statements)
References 0 publications
“…For the (HBF$_\gamma$) the objective function $f$ does not decrease along the trajectories. On the other hand, the global energy (kinetic + potential) $E(t) = \frac{1}{2}\|\dot{x}(t)\|^2 + f(x(t))$ is monotone decreasing, being a continuous Lyapunov-type functional. Furthermore, we recall that Attouch.…”
Section: Preliminaries
confidence: 99%
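A one-line check of this monotonicity (a sketch, assuming the standard heavy-ball-with-friction dynamics $\ddot{x}(t) + \gamma\dot{x}(t) + \nabla f(x(t)) = 0$ with $\gamma > 0$, which is what (HBF$_\gamma$) typically denotes): differentiating $E$ along a trajectory gives

$\frac{d}{dt}E(t) = \langle \dot{x}(t), \ddot{x}(t) + \nabla f(x(t)) \rangle = -\gamma\,\|\dot{x}(t)\|^2 \le 0,$

so the energy is non-increasing even when $f$ itself oscillates along the trajectory.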
“…If the vector field on the right-hand side can be decomposed into two vector fields, i.e. $F = F^{[1]} + F^{[2]}$, then we attach the associated sub-problems of (5), namely $(P_1) : \frac{d}{dt}\varphi(t) = F^{[1]}(\varphi(t))$ and $(P_2) : \frac{d}{dt}\varphi(t) = F^{[2]}(\varphi(t))$…”
Section: Some Notes On Splitting Operator Techniques
confidence: 99%
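A minimal sketch of this splitting idea in Python (the function and field names below are illustrative assumptions, not taken from the paper; each sub-problem is advanced with an explicit Euler sub-step, although the sub-flows could equally be solved exactly or with any one-step integrator):

import numpy as np

def lie_trotter_step(phi, h, F1, F2):
    # One Lie-Trotter splitting step for d/dt phi = F1(phi) + F2(phi):
    # solve sub-problem (P1) for time h, then sub-problem (P2) for time h.
    phi = phi + h * F1(phi)   # explicit Euler sub-step for (P1)
    phi = phi + h * F2(phi)   # explicit Euler sub-step for (P2)
    return phi

# Illustrative decomposition F = F1 + F2 (assumed, for demonstration only).
F1 = lambda x: -x            # linear part
F2 = lambda x: np.sin(x)     # nonlinear part

phi = np.array([1.0, 0.5])   # initial condition phi(0)
h = 0.01                     # step size
for _ in range(1000):
    phi = lie_trotter_step(phi, h, F1, F2)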