2021
DOI: 10.1016/j.petrol.2021.108678
Prediction of reservoir saturation field in high water cut stage by bore-ground electromagnetic method based on machine learning

Cited by 17 publications (2 citation statements)
References 14 publications
“…Because the computation is simple and easy to implement, some researchers have employed KNN for data-driven petrophysical characterization and, based on the analysis of validation results, confirmed its effectiveness in predicting reservoir parameters [23][24][25]. Since KNN is a lazy learner, meaning all learning samples must be scanned to find the required neighbors for each test sample, prediction on a large test dataset becomes seriously time-consuming; a "KD-tree" or "Ball-tree," which gives KNN a pre-built search path to the neighbors, is therefore commonly used in practice [23,24]. However, even with such tree-based preprocessing, KNN remains inefficient at prediction, because a stable input-output mapping usually requires a large learning dataset, and training on more samples inevitably slows both the construction and the querying of the "KD-tree" or "Ball-tree."…”
Section: Introduction
confidence: 90%
“…KNN primarily uses the several learning neighbors closest to the test sample to generate an approximate regression [22]. Because the computation is simple and easy to implement, some researchers have employed KNN for data-driven petrophysical characterization and, based on the analysis of validation results, confirmed its effectiveness in predicting reservoir parameters [23][24][25]. Since KNN is a lazy learner, meaning all learning samples must be scanned to find the required neighbors for each test sample, prediction on a large test dataset becomes seriously time-consuming; a "KD-tree" or "Ball-tree," which gives KNN a pre-built search path to the neighbors, is therefore commonly used in practice [23,24].…”
Section: Introduction
confidence: 99%
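The quoted passages describe KNN regression as a lazy learner: there is no training phase, so every prediction must scan the entire learning set to find its neighbors, which is the cost a KD-tree or Ball-tree index is meant to reduce. A minimal pure-Python sketch of the brute-force (unindexed) case, using a hypothetical one-dimensional toy dataset rather than real petrophysical data:

```python
def knn_predict(train, queries, k=3):
    """Lazy-learning KNN regression: each query scans ALL training samples.

    train   -- list of (feature, target) pairs
    queries -- list of feature values to predict for
    """
    preds = []
    for q in queries:
        # Brute-force distance scan over the whole training set --
        # exactly the per-query cost a KD-tree/Ball-tree index avoids.
        ranked = sorted((abs(x - q), y) for x, y in train)
        neighbors = ranked[:k]                      # k nearest neighbors
        preds.append(sum(y for _, y in neighbors) / k)  # average their targets
    return preds


# Toy data following y = 2x; a query near x = 2.1 averages the
# targets of its three nearest neighbors (x = 2, 3, 1).
train = [(0.0, 0.0), (1.0, 2.0), (2.0, 4.0), (3.0, 6.0), (4.0, 8.0)]
print(knn_predict(train, [2.1], k=3))  # → [4.0]
```

In practice, an indexed implementation (e.g. scikit-learn's `KNeighborsRegressor` with `algorithm="kd_tree"` or `"ball_tree"`) replaces the linear scan with a tree query; as the quoted passage notes, building and querying that tree still grows more expensive as the learning set grows.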