Proceedings of the Princeton Symposium on Mathematical Programming 2015
DOI: 10.1515/9781400869930-007

Stochastic Geometric Programming

Cited by 11 publications (13 citation statements). References 11 publications.

“…We consider the widely adopted gradient descent (GD) algorithm (Avriel, 2003) in full-graph GNN training. Similar to previous studies (Fu et al., 2020; Chen et al., 2020), full-graph training can be modeled as the following non-convex empirical risk minimization problem:…”
Section: Impact of Message Quantization on Training Convergence
confidence: 99%
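The statement above casts full-graph GNN training as gradient descent on a non-convex empirical risk. As a minimal illustration of the underlying GD update (on a toy least-squares risk, not the GNN objective from the cited work; the data and step size are invented for illustration):

```python
# Gradient descent on an empirical risk:
# minimize f(w) = (1/n) * sum_i (w * x_i - y_i)^2 over a scalar w.
# Toy data, roughly y = 2x; nothing here comes from the cited paper.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 3.9, 6.2, 8.1]

def grad(w):
    n = len(xs)
    return sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / n

w, lr = 0.0, 0.01
for _ in range(500):
    w -= lr * grad(w)  # GD step: w_{t+1} = w_t - lr * grad f(w_t)

print(round(w, 2))  # close to the least-squares slope 60.9 / 30 = 2.03
```

With a step size below the curvature threshold, the iterates contract toward the unique minimizer; for genuinely non-convex risks (as in GNN training), GD only guarantees convergence to a stationary point.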
“…Next, the component goodness values are calculated such that they maximize Pr((A, e) | d_k). This is done by applying a gradient ascent procedure [9] (lines 9-11).…”
Section: Algorithm
confidence: 99%
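The quoted step maximizes a probability by gradient ascent. A hedged sketch of that idea on a stand-in objective (the Bernoulli log-likelihood of k successes in n trials; the actual objective Pr((A, e) | d_k) in the cited work is different):

```python
import math

# Gradient ascent on log L(p) = k*log(p) + (n-k)*log(1-p).
# The closed-form maximizer is the MLE p* = k/n; ascent recovers it.
k, n = 7, 10

def grad(p):
    return k / p - (n - k) / (1 - p)

p, lr = 0.5, 0.01
for _ in range(1000):
    p += lr * grad(p)                 # ascent: move *up* the gradient
    p = min(max(p, 1e-6), 1 - 1e-6)   # keep p inside (0, 1)

print(round(p, 2))  # approaches the MLE k/n = 0.7
```

The only difference from gradient descent is the sign of the update; the projection onto (0, 1) keeps the logarithms defined.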
“…It should be noted that the authors, Khanjani, Tavana, DiCaprio, and Fukuyama, reference Baoding Liu's extensive work [6] on uncertainty, which they use to derive deterministic equivalents for chance-constrained geometric programs; see, for example, [7][8][9][10][11][12][13]. Theirs [1] is, to our knowledge, the only application of uncertain geometric programming models whose deterministic equivalents are standard posynomial geometric programs.…”
Section: Introduction
confidence: 99%
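The statement above concerns deterministic equivalents that are standard posynomial geometric programs. What makes that form tractable is the classical log change of variables, under which a posynomial GP becomes convex. A toy sketch (the posynomial f(x) = x + 1/x is invented for illustration and is not one of the cited models):

```python
import math

# A posynomial GP becomes convex after the substitution x = exp(u):
# f(x) = x + 1/x  ->  g(u) = e^u + e^(-u), which is convex in u.
# Plain gradient descent on g then finds the global optimum.

def g(u):
    return math.exp(u) + math.exp(-u)

def dg(u):
    return math.exp(u) - math.exp(-u)

u, lr = 1.5, 0.1
for _ in range(200):
    u -= lr * dg(u)

x = math.exp(u)  # map back to the original variable
print(round(x, 3), round(g(u), 3))  # optimum is x* = 1 with f(x*) = 2
```

The same transformation underlies general GP solvers: every monomial term becomes an exponential of an affine function of u, so the whole posynomial objective and constraints become convex.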