This paper proposes the Bayesian Extreme Learning Machine Kohonen Network (BELMKN) framework to solve the clustering problem. The BELMKN framework processes nonlinearly separable datasets in three levels to obtain accurate clustering. In the first level, an Extreme Learning Machine (ELM)-based feature learning approach captures the nonlinearity in the data distribution by mapping it onto a d-dimensional space. In the second level, the ELM-extracted features are used as input to the Bayesian Information Criterion (BIC) to predict the number of clusters, termed cluster prediction. In the final level, the feature-extracted data, along with the cluster prediction, is passed to the Kohonen Network to obtain improved clustering accuracy. The main advantage of the proposed method is that it does not require a priori identifiers or class labels for the data, which are difficult to obtain for most real-world datasets. The BELMKN framework is applied to 3 synthetic datasets and 10 benchmark datasets from the UCI machine learning repository and compared with state-of-the-art clustering methods. The experimental results show that the proposed BELMKN-based clustering outperforms the other clustering algorithms on the majority of the datasets. Hence, the BELMKN framework can be used to improve the clustering accuracy of nonlinearly separable datasets.
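The three-level pipeline described above can be sketched in a few lines of Python. The snippet below is a minimal illustration, not the authors' implementation: an ELM-style random feature map, BIC over Gaussian mixtures to predict the number of clusters, and a simplified Kohonen (winner-take-all) layer for the final assignment. All function names and parameter values are illustrative assumptions.

```python
# Minimal sketch of the BELMKN pipeline (illustrative, not the authors' code):
# ELM-style random feature map -> BIC-based cluster-count prediction -> Kohonen layer.
import numpy as np
from sklearn.mixture import GaussianMixture

def elm_features(X, d=20, seed=0):
    """ELM feature learning: a fixed random hidden layer with sigmoid units."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], d))   # random input weights (not trained)
    b = rng.normal(size=d)                 # random biases
    return 1.0 / (1.0 + np.exp(-(X @ W + b)))

def predict_k_with_bic(H, k_max=10):
    """Pick the cluster count that minimises the BIC of a Gaussian mixture fit."""
    bics = [GaussianMixture(n_components=k, random_state=0).fit(H).bic(H)
            for k in range(1, k_max + 1)]
    return int(np.argmin(bics)) + 1

def kohonen_cluster(H, k, epochs=100, lr0=0.5):
    """A simplified Kohonen layer with k units (neighborhood updates omitted)."""
    rng = np.random.default_rng(0)
    w = H[rng.choice(len(H), k, replace=False)].copy()    # initial prototypes
    for t in range(epochs):
        lr = lr0 * (1 - t / epochs)                       # decaying learning rate
        for x in H[rng.permutation(len(H))]:
            j = np.argmin(np.linalg.norm(w - x, axis=1))  # best-matching unit
            w[j] += lr * (x - w[j])                       # move the winner toward x
    return np.argmin(np.linalg.norm(H[:, None] - w, axis=2), axis=1)

# X = ...  # (n_samples, n_features) data matrix
# H = elm_features(X); k = predict_k_with_bic(H); labels = kohonen_cluster(H, k)
```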
Historically, the pioneers in Permanent Downhole Gauge (PDG) deployment were large operators targeting real-time reservoir surveillance on a few high-profile wells or field development projects. Over time, PDG utilization became an industry standard: unit prices went down and reliability increased significantly, making PDG deployment widespread. The interest in PDG data goes beyond simply recording pressure and temperature at any given time for reservoir surveillance and monitoring. The combination of well production data and PDG-acquired pressure data proves to be an ideal dataset for reservoir characterization and improved production optimization. This paper presents a dynamic real-time well testing workflow used in analyzing PDG data for Pressure Transient Analysis (PTA) on a complex reservoir geometry offshore Nigeria. It discusses the extent and applicability of PTA using PDG data for complex reservoir and well performance characterization. In analyzing such PDG data, it was important to correctly identify anomalies, such as tidal effects, which could hamper the interpretation of the complex dataset. The presence of these anomalies can significantly impact the analysis, resulting in incorrect flow-regime identification and erroneous estimation of well and reservoir parameters. Therefore, the tidal signal and other noise had to be identified and removed from the pressure data before interpretation. Commercially available software was used for de-noising and managing the real-time measurements, as well as for modeling and analysis of the test data. Actual real-time pressure from the PDG and production data from surface well test measurements were used in the analysis to determine reservoir parameters and evaluate well performance. This paper elaborates on proper data-handling techniques for offshore-acquired PDG measurements used in Pressure Transient Analysis. The workflow consequently resulted in significant cost savings due to a reduction in the number of planned shut-ins.
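The paper relies on commercial software for de-noising. Purely as an illustration of the kind of pre-processing involved, the sketch below suppresses a tidal signature by notch-filtering the principal tidal periods from a regularly sampled gauge pressure series; the function name, sampling interval, and periods are assumptions, not the workflow's actual parameters.

```python
# Illustrative only: a generic way to suppress tidal signatures in gauge pressure
# by notching the principal tidal periods (assumed values, in hours).
import numpy as np
from scipy.signal import iirnotch, filtfilt

def remove_tidal_signal(pressure, sample_interval_hr, periods_hr=(12.42, 12.0, 23.93)):
    """Apply narrow notch filters at the given tidal periods (hours)."""
    fs = 1.0 / sample_interval_hr            # samples per hour
    cleaned = np.asarray(pressure, dtype=float)
    for period in periods_hr:
        f0 = 1.0 / period                    # tidal frequency, cycles per hour
        if f0 < fs / 2:                      # only filter below the Nyquist limit
            b, a = iirnotch(w0=f0, Q=30.0, fs=fs)
            cleaned = filtfilt(b, a, cleaned)
    return cleaned

# p_clean = remove_tidal_signal(pdg_pressure, sample_interval_hr=0.1)
```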
Over the years, well test analysis or pressure transient analysis (PTA) methods have progressed from straight lines via type-curve analysis to pressure derivatives and deconvolution methods. Today, analysis of the log-log (pressure and its derivative) response is the most used method for PTA. Although these methods are widely available through commercial software, they are not fully automated, and human interaction is needed for their application. Furthermore, PTA is an inverse problem whose solution is in general non-unique, and several models (well, reservoir, and boundary) can be found to fit a similar pressure-derivative response. This often creates confusion in choosing the correct model with the conventional approach and results in multiple iterations that are time-consuming and require constant human interaction. Our approach automates the process of PTA using a Siamese neural network (SNN) architecture composed of Convolutional Neural Network (CNN) and Long Short-Term Memory (LSTM) layers. The SNN model is trained on simulated data created using a design of experiments (DOE) approach covering the 14 most common interpretation scenarios across well, reservoir, and boundary model types. Across each model type, parameters such as permeability, horizontal well length, skin factor, and distance to the boundary were sampled to compute 560 different pressure-derivative responses. The SNN is trained using a self-supervised strategy in which positive and negative pairs are generated from the training data; transformations such as compression and expansion of the well test model responses are used to generate these pairs. For a given well test model response, similarity scores are computed against the candidates in each model class, and the best match from each class is identified. These matches are then ranked according to their similarity scores to identify the optimal candidates. Experimental analysis indicated that the true model class frequently appeared among the top-ranked classes. The model achieves an accuracy of 93% for the top-one model recommendation when tested on 70 samples from the 14 interpretation scenarios. Prior information on the top-ranked probable well test models significantly reduces the manual effort involved in the analysis. This machine learning (ML) approach can be integrated with any PTA software or function as a standalone application on the interpreter's system. The current SNN with CNN and LSTM layers can speed up identification of the pressure-derivative response explained by a certain combination of well, reservoir, and boundary models and produce models with less user interaction. This methodology will help the interpretation engineer recognize candidate models faster for detailed integration with additional information from sources such as geophysics, geology, petrophysics, drilling, and production logging.
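As a rough illustration of the described architecture (not the authors' code), the sketch below defines a Siamese encoder with Conv1D and LSTM layers that embeds a pressure-derivative curve and ranks candidate model classes by the best cosine similarity of their embeddings. Layer sizes, embedding dimension, and function names are assumptions.

```python
# Minimal sketch of a Siamese encoder (Conv1D + LSTM) for pressure-derivative
# curves, plus similarity-based ranking of well test model classes.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DerivativeEncoder(nn.Module):
    def __init__(self, embed_dim=64):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(),
        )
        self.lstm = nn.LSTM(input_size=32, hidden_size=embed_dim, batch_first=True)

    def forward(self, x):                    # x: (batch, length) derivative curve
        h = self.conv(x.unsqueeze(1))        # (batch, 32, length)
        _, (h_n, _) = self.lstm(h.transpose(1, 2))
        return F.normalize(h_n[-1], dim=1)   # unit-norm embedding per curve

def rank_model_classes(encoder, query, candidates_by_class):
    """Rank candidate model classes by their best cosine similarity to the query."""
    with torch.no_grad():
        q = encoder(query.unsqueeze(0))                      # (1, embed_dim)
        scores = {name: (encoder(cands) @ q.T).max().item()  # best match per class
                  for name, cands in candidates_by_class.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
```

In practice, the encoder would be trained with a contrastive objective on positive pairs (e.g., compressed or expanded versions of the same response) and negative pairs drawn from different model classes, as described above.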
Vertical Interference Tests (VIT) using wireline formation testers are industry-standard tests to estimate the vertical permeability of reservoir pay zones. In general, the test interval is considered homogeneous for the interpretation, leading to an inaccurate estimation of vertical permeability (Kv) in complex geological systems such as thinly laminated beds, channel sands, etc. This paper presents a novel approach that accounts for this heterogeneity through petrophysical and borehole image-based rock-typing methods, leading to a more realistic characterization of vertical permeability. Advanced petrophysical logs and images are used to generate rock types through Artificial Neural Network (ANN) and Stratigraphic Modified Lorenz Plot (SMLP) techniques. These rock types are then used as inputs to the vertical interference test interpretation model, thereby factoring reservoir heterogeneity into the derivation of vertical permeability. This is followed by a sensitivity analysis to examine the impact on permeability results in multiple geological settings such as channel sands, thinly laminated beds, near-fault regions, and pinch-outs. Vertical permeability (Kv) is a major input in the majority of advanced reservoir engineering calculations and has a significant impact on the field development plan and IOR/EOR techniques. This approach of accounting for rock types in a VIT interpretation model leads to a more realistic estimation of vertical permeability. The rock-typing techniques used here allow the user to define the number of layers and the minimum interval thickness, which is extremely useful in highly laminated reservoirs. The sensitivity analysis plays a key role in understanding the utility and limitations of both the conventional and the new approach in complex geological systems. In the case of thick, homogeneous reservoir sand units, the conventional approach can be used with fairly accurate results. However, in cases of thin sand-shale units with low net-to-gross ratio, the new approach gives a good estimation of the layer-wise permeability distribution. This paper blends petrophysical and dynamic analyses into a novel workflow. The results from the sensitivity study discussed in the paper can be used as standard criteria for determining the most suitable technique for interpreting a vertical interference test. This approach allows the user to optimize interpretation time while ensuring the accuracy of the results.
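As an illustration of the SMLP idea mentioned above (not the paper's implementation), the sketch below computes the cumulative flow-capacity versus storage-capacity curve from layer thickness, permeability, and porosity values sorted in stratigraphic order; slope breaks on this curve are commonly used to delineate flow units. The function name and inputs are assumptions.

```python
# Minimal sketch of the Stratigraphic Modified Lorenz Plot (SMLP): cumulative
# flow capacity (k*h) plotted against cumulative storage capacity (phi*h) in
# stratigraphic (depth) order; slope changes mark flow-unit boundaries.
import numpy as np

def smlp(thickness, permeability, porosity):
    """Return cumulative storage-capacity and flow-capacity fractions (depth-ordered inputs)."""
    kh = permeability * thickness            # flow capacity per layer
    phih = porosity * thickness              # storage capacity per layer
    flow_frac = np.cumsum(kh) / kh.sum()
    storage_frac = np.cumsum(phih) / phih.sum()
    return storage_frac, flow_frac           # plot flow_frac versus storage_frac

# storage, flow = smlp(h, k, phi)
# Steep segments indicate high-flow units; flat segments indicate baffles or barriers.
```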