2021
DOI: 10.3390/jmse9080888

Classification of Reservoir Recovery Factor for Oil and Gas Reservoirs: A Multi-Objective Feature Selection Approach

Abstract: The accurate classification of reservoir recovery factor is dampened by irregularities such as noisy and high-dimensional features associated with the reservoir measurements or characterization. These irregularities, especially a larger number of features, make it difficult to perform accurate classification of reservoir recovery factor, as the generated reservoir features are usually heterogeneous. Consequently, it is imperative to select relevant reservoir features while preserving or amplifying reservoir re…

Cited by 10 publications (9 citation statements)
References 41 publications
“…ANN classifiers, introduced here together with GWO, were used in this study. The author also applies the proposed method to oil and gas datasets [7]. As a result, it obtains good results compared with the MOPSO and NSGA-II methods.…”
Section: Feature Selection
confidence: 96%
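The statement above pairs a metaheuristic search (GWO) with an ANN classifier for feature selection. Below is a minimal, hypothetical sketch of the kind of wrapper fitness such a search could evaluate for each candidate feature mask; the synthetic dataset, the penalty weight alpha, and the ANN settings are assumptions for illustration, not details taken from the cited papers.

```python
# Hypothetical wrapper fitness for metaheuristic feature selection (GWO, MOPSO,
# NSGA-II, ...): lower is better, combining ANN classification error with a
# penalty on the number of selected features.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier

# Synthetic stand-in for a reservoir dataset (assumption for the sketch).
X, y = make_classification(n_samples=300, n_features=20, n_informative=8, random_state=0)

def wrapper_fitness(mask, alpha=0.1):
    """ANN error on the selected features plus a feature-count penalty."""
    selected = np.flatnonzero(mask)
    if selected.size == 0:                     # an empty mask is invalid
        return 1.0 + alpha
    ann = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500, random_state=0)
    accuracy = cross_val_score(ann, X[:, selected], y, cv=3).mean()
    return (1.0 - accuracy) + alpha * selected.size / mask.size

# A metaheuristic would search over binary masks; here we score one random candidate.
rng = np.random.default_rng(0)
print(wrapper_fitness(rng.integers(0, 2, size=X.shape[1])))
```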
“…It is less efficient, easily becomes trapped in local optima, and has scalability issues when dealing with high-dimensional datasets. Many studies, such as [7][8][9], have demonstrated that metaheuristics outperform the traditional methods. The embedded model overcomes the limitations of the filter and wrapper models by embedding the selection step within the construction of the classifier.…”
Section: Feature Selection
confidence: 99%
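As a hedged illustration of the embedded model mentioned above, the sketch below uses an L1-penalized linear classifier: coefficients are driven to zero during training, so feature selection happens as part of fitting the classifier rather than as a separate filter or wrapper pass. The dataset and penalty strength are assumptions for the sketch only.

```python
# Hedged sketch of embedded feature selection: an L1-penalized logistic
# regression zeroes out coefficients while it is being fit, so the retained
# features fall out of the training step itself.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=300, n_features=20, n_informative=6, random_state=1)

embedded = LogisticRegression(penalty="l1", solver="liblinear", C=0.1).fit(X, y)
kept = np.flatnonzero(embedded.coef_[0] != 0.0)
print("features retained by the embedded model:", kept)
```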
“…The standard NSGA algorithm has been criticized for the high computational cost of its non-dominated sorting, its lack of elitism, and the need to specify a sharing parameter, which is why NSGA-II is adopted here to overcome these limitations. NSGA-II, the best-known multi-objective optimization algorithm, is characterized by three key features: a fast non-dominated sorting technique, a fast estimation of the crowding-distance parameter, and a simple crowded-comparison operator [16]. According to Babajamali et al. [17], the NSGA-II optimization approach performed much better than PAES and SPEA in finding a wider variety of solutions to several test problems from previous research.…”
Section: Non-Dominated Sorting Genetic Algorithm (NSGA-II) Optimization
confidence: 99%
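To make the three NSGA-II ingredients listed above concrete, here is a small, hypothetical sketch of the crowded-comparison operator: a solution is preferred if it lies on a better (lower) non-domination front and, at equal rank, if it has the larger crowding distance. The Solution fields are illustrative names, not code from [16].

```python
# Hypothetical sketch of NSGA-II's crowded-comparison operator.
from dataclasses import dataclass

@dataclass
class Solution:
    rank: int                 # index of the non-domination front (0 = best)
    crowding_distance: float  # how isolated the solution is within its front

def crowded_compare(a: Solution, b: Solution) -> Solution:
    """Prefer the lower rank; break ties with the larger crowding distance."""
    if a.rank != b.rank:
        return a if a.rank < b.rank else b
    return a if a.crowding_distance > b.crowding_distance else b

# Same front, so the more isolated (larger crowding distance) solution wins.
print(crowded_compare(Solution(0, 0.4), Solution(0, 1.2)))
```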
“…In (6), if the updated position goes out of bounds, the affected dimension of the candidate solution is assigned the corresponding value of the global best.…”
Section: Proposed Algorithm
confidence: 99%
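The boundary rule quoted above can be written in a few lines. The sketch below is an assumed, simplified version: any dimension of an updated candidate that leaves the search range [lower, upper] is reset to the global best's value in that dimension; the function and variable names are illustrative, not taken from the cited paper.

```python
# Hypothetical sketch of the out-of-bounds repair rule described above.
import numpy as np

def repair_out_of_bounds(position, global_best, lower, upper):
    """Replace dimensions outside [lower, upper] with the corresponding global-best values."""
    out = (position < lower) | (position > upper)
    return np.where(out, global_best, position)

position    = np.array([0.3, 1.7, -0.2])   # second and third dimensions are out of range
global_best = np.array([0.5, 0.9,  0.1])
print(repair_out_of_bounds(position, global_best, lower=0.0, upper=1.0))  # -> [0.3 0.9 0.1]
```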
“…To overcome the limitations of conventional approaches, meta-heuristic algorithms, as stochastic methods, can be used to solve complicated real-world optimization problems. Meta-heuristic algorithms have shown robust performance when applied to different optimization problems in various fields such as wireless communications [3][4][5] and artificial intelligence [6][7][8].…”
Section: Introduction
confidence: 99%