2022
DOI: 10.1007/s10596-022-10135-9
An adaptive moment estimation framework for well placement optimization

Abstract: In this study, we propose the use of a first-order gradient framework, adaptive moment estimation (Adam), in conjunction with a stochastic gradient approximation, for well location and trajectory optimization problems. The Adam framework incorporates information from previous gradients to calculate variable-specific progression steps. As a result, the search progression is adjusted individually for each variable, which allows a convergence speed-up in problems where the gra…
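The per-variable step sizing the abstract describes comes from Adam's two exponential moving averages: one of the gradient and one of its square. A minimal sketch of a single Adam update (standard formulation with the usual default hyperparameters; the function name and quadratic test objective are illustrative, not from the paper):

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=0.1, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update. m tracks the mean gradient (momentum), v tracks the
    mean squared gradient (RMSProp-style scaling); dividing the two yields a
    step size that adapts per variable."""
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad**2
    m_hat = m / (1 - beta1**t)          # bias correction for zero-initialized m
    v_hat = v / (1 - beta2**t)          # bias correction for zero-initialized v
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v
```

Because the step is `m_hat / sqrt(v_hat)`, variables with consistently large gradients take proportionally smaller normalized steps than the raw gradient would dictate, which is the "variable-specific progression" the abstract refers to.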

Cited by 16 publications (4 citation statements). References 37 publications (55 reference statements).
“…Hijry and Olawoyin [27] proposed a solution that studies deep learning techniques for historical queueing variables, to be used in addition to, or instead of, queueing theory (QT) to anticipate patient waiting times in a system. They employed four optimization strategies: Stochastic Gradient Descent (SGD) [28], adaptive moment estimation (Adam) [29], [30], Root Mean Square Propagation (RMSprop), and Adaptive Gradient (AdaGrad) [31]. The model is evaluated using the mean absolute error (MAE). A conventional mathematical simulation is adopted for further analysis.…”
Section: Comparison of DT, RF and XGBoost (mentioning)
confidence: 99%
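The MAE metric mentioned in the excerpt is the average absolute deviation between predicted and observed values. A minimal sketch (the function name and the waiting-time numbers are illustrative, not from the cited work):

```python
import numpy as np

def mean_absolute_error(y_true, y_pred):
    """MAE: mean of |observed - predicted|, in the units of the target."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return float(np.mean(np.abs(y_true - y_pred)))

# e.g. actual vs. predicted waiting times in minutes
mean_absolute_error([30, 45, 60], [28, 50, 55])  # → 4.0
```

Unlike squared-error metrics, MAE penalizes all errors linearly, so a few large mispredictions do not dominate the score.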
“…Applying some form of momentum to reservoir management optimization is not entirely untested. Arouri and Sayyafzadeh [14] applied the Adam method together with the SPSA (Simultaneous Perturbation Stochastic Approximation) gradient to form Adam-SPSA, which was applied first to the problem of placing production wells and later to well control optimization [15]. Both references report significant improvements in stability and final objective function value.…”
Section: Ensemble Optimization With Momentum (mentioning)
confidence: 99%
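The Adam-SPSA combination named above pairs a derivative-free gradient estimate with Adam's update rule: SPSA perturbs all variables at once with a random ±1 vector and estimates the full gradient from just two objective evaluations. A minimal sketch of the idea, assuming standard SPSA and Adam formulations (function names, hyperparameters, and the quadratic test objective are illustrative, not the authors' implementation):

```python
import numpy as np

def spsa_gradient(f, x, c=0.1, rng=None):
    """Two-evaluation SPSA gradient estimate using a Bernoulli +/-1 perturbation."""
    rng = rng or np.random.default_rng()
    delta = rng.choice([-1.0, 1.0], size=x.shape)
    return (f(x + c * delta) - f(x - c * delta)) / (2 * c * delta)

def adam_spsa_minimize(f, x0, steps=500, lr=0.1,
                       beta1=0.9, beta2=0.999, eps=1e-8, seed=0):
    """Minimize f by feeding SPSA gradient estimates into the Adam update."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float).copy()
    m = np.zeros_like(x)
    v = np.zeros_like(x)
    for t in range(1, steps + 1):
        g = spsa_gradient(f, x, rng=rng)         # stochastic gradient estimate
        m = beta1 * m + (1 - beta1) * g          # first moment (momentum)
        v = beta2 * v + (1 - beta2) * g**2       # second moment (scaling)
        m_hat = m / (1 - beta1**t)
        v_hat = v / (1 - beta2**t)
        x -= lr * m_hat / (np.sqrt(v_hat) + eps)
    return x
```

The appeal for well placement is that each iteration costs only two reservoir simulations regardless of the number of decision variables, while Adam's moment averaging smooths the noise in the SPSA estimate.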
“…Adam [33], [34] is an algorithm that combines RMSProp and Momentum, retaining RMSProp's per-parameter learning rate and Momentum's weighted average of past gradients [35], [36].…”
Section: 2 Adaptive Moment Estimation (Adam) (unclassified)