2007
DOI: 10.1111/j.1365-246x.2007.03342.x
Neural network modelling and classification of lithofacies using well log data: a case study from KTB borehole site

Abstract: A novel approach based on the concept of a super self-adapting back propagation (SSABP) neural network has been developed for classifying lithofacies boundaries from well log data. The SSABP learning paradigm has been applied to constrain the lithofacies boundaries by parameterizing three sets of well log data, that is, density, neutron porosity and gamma ray, obtained from the German Continental Deep Drilling Project (KTB). A multilayer perceptron (MLP) neural network model was generated in a superv…
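As a rough illustration of the supervised set-up the abstract describes (not the authors' SSABP implementation), the sketch below trains a generic MLP classifier on three synthetic log curves. The log values, facies labels and network size are placeholder assumptions, not the KTB data.

```python
# Hedged sketch: a generic supervised MLP lithofacies classifier on three logs
# (density, neutron porosity, gamma ray). Synthetic samples and labels are
# placeholders, not the KTB data or the paper's SSABP network.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 1000
# Fake log samples: columns = [density (g/cc), neutron porosity (frac), gamma ray (API)]
X = np.column_stack([
    rng.normal(2.7, 0.1, n),
    rng.uniform(0.0, 0.3, n),
    rng.normal(80, 30, n),
])
# Fake two-class facies labels from a made-up rule
y = (X[:, 2] > 90).astype(int)

Xs = StandardScaler().fit_transform(X)
clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
clf.fit(Xs[:800], y[:800])
print("hold-out accuracy:", clf.score(Xs[800:], y[800:]))
```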

Cited by 85 publications (38 citation statements). References: 35 publications.
“…They can be given a priori from manual analysis of the GR or shallow-resistivity logs, determined by multivariate statistical methods such as cluster analysis, fuzzy logic and neural networks (Maiti et al., 2007), or estimated automatically by interval inversion. In solving the inverse problem, we apply two different approaches.…”
Section: Inverse Problem
confidence: 99%
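A minimal sketch of the kind of multivariate zonation this quote alludes to: unsupervised k-means clustering of standardized log samples, with layer boundaries read off wherever the cluster label changes down-hole. The synthetic logs, depth grid and number of clusters are assumptions made only for illustration.

```python
# Hedged sketch: cluster analysis of well-log samples to suggest layer
# boundaries (one of the multivariate options mentioned in the quote).
# Synthetic GR/resistivity logs and a fixed cluster count are assumptions.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
depth = np.arange(0.0, 100.0, 0.5)
# Two-layer synthetic GR and resistivity logs with noise
gr = np.where(depth < 60, 40, 110) + rng.normal(0, 5, depth.size)
res = np.where(depth < 60, 200, 20) + rng.normal(0, 3, depth.size)

X = StandardScaler().fit_transform(np.column_stack([gr, res]))
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

# A boundary is flagged wherever the cluster label changes with depth
boundaries = depth[1:][np.diff(labels) != 0]
print("suggested layer boundaries (m):", boundaries)
```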
“…The processing elements/nodes are interconnected layer by layer, and the function of each node is determined by the connection weights, biases and the structure of the network (Bishop, 1995; Poulton, 2001). Detailed sequential developments of the ANN methods are available elsewhere (Poulton, 2001; Maiti et al., 2007; Maiti and Tiwari, 2010b). In the popular back-propagation method, the error is usually minimized by adjusting weights and biases via the gradient-based iterative chain rule from the output layer to the input layer (Rumelhart et al., 1986).…”
Section: Multi-layer Perceptrons (MLP)
confidence: 99%
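To make the chain rule described in this excerpt concrete, here is a bare-bones numpy sketch of one back-propagation step for a single-hidden-layer MLP with a squared-error loss. The layer sizes, sigmoid activation and learning rate are illustrative choices, not the configuration used in the cited papers.

```python
# Hedged sketch: one back-propagation step, propagating the error gradient
# from the output layer back to the input weights via the chain rule.
# Sizes, activation and learning rate are arbitrary illustration values.
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(size=(3, 1))          # one input sample (3 log values)
t = np.array([[1.0]])                # target class score

W1, b1 = rng.normal(size=(5, 3)), np.zeros((5, 1))   # hidden layer
W2, b2 = rng.normal(size=(1, 5)), np.zeros((1, 1))   # output layer
eta = 0.1                                            # learning rate

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

# Forward pass
h = sigmoid(W1 @ x + b1)
y = sigmoid(W2 @ h + b2)

# Backward pass (chain rule, squared-error loss E = 0.5*(y - t)^2)
delta2 = (y - t) * y * (1 - y)           # output-layer error term
delta1 = (W2.T @ delta2) * h * (1 - h)   # hidden-layer error term

# Gradient-descent weight/bias updates
W2 -= eta * delta2 @ h.T; b2 -= eta * delta2
W1 -= eta * delta1 @ x.T; b1 -= eta * delta1
```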
“…It is slow and often gets trapped in one of the many local minima of a complex error surface. Consequently, to avoid this problem, many improvements have been put forward, such as the inclusion of a momentum variable and an adaptive learning rate (Poulton, 2001; Maiti et al., 2007), conjugate gradients (Bishop, 1995), scaled conjugate gradients (Bishop, 1995; Maiti and Tiwari, 2010b), and Levenberg-Marquardt (Bishop, 1995; Poulton, 2001).…”
Section: R. K. Tiwari and S. Maiti: Bayesian Neural Network Modeling
confidence: 99%
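To show what the momentum and adaptive-learning-rate improvements amount to, the sketch below implements a generic error-driven update rule: part of the previous weight change is reused, and the learning rate grows when the error drops and shrinks when it rises. The increase/decrease factors and momentum value are assumptions for illustration, not the specific SSABP settings of Maiti et al. (2007).

```python
# Hedged sketch: gradient-descent update with a momentum term and a simple
# error-driven adaptive learning rate. The factors below are illustrative.
import numpy as np

def update(w, grad, prev_dw, eta, err, prev_err,
           alpha=0.5, up=1.05, down=0.7):
    """One weight update with momentum and adaptive learning-rate control."""
    # Speed up if the error decreased since the last step, back off otherwise
    eta = eta * up if err < prev_err else eta * down
    dw = -eta * grad + alpha * prev_dw   # momentum keeps part of the last step
    return w + dw, dw, eta

# Toy usage on E(w) = w^2 (gradient 2w); w should approach 0
w, dw, eta, prev_err = 5.0, 0.0, 0.1, np.inf
for _ in range(20):
    err = w**2
    w, dw, eta = update(w, 2 * w, dw, eta, err, prev_err)
    prev_err = err
print("w after 20 steps:", w)
```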
“…For example: (1) seismic event classification (Dystart and Pulli, 1990); (2) well log analysis (Aristodemou et al., 2005; Maiti et al., 2007; Maiti and Tiwari, 2007, 2010b); (3) first-arrival picking (Murat and Rudman, 1992); (4) earthquake prediction (Feng et al., 1997); (5) inversion (Raiche, 1991; Devilee et al., 1999); (6) parameter estimation in geophysics (Macias et al., 2000); (7) prediction of aquifer water level (Coppola Jr. et al., 2005); (8) magnetotelluric data inversion (Spichak and Popova, 2000); (9) magnetic interpretations (Bescoby et al., 2006); (10) signal discrimination (Maiti and Tiwari, 2010a); (11) modeling (Sri Lakshmi and Tiwari, 2009); (12) DC resistivity inversion (Qady and Ushijima, 2001; Lampinen and Vehtari, 2001; Singh et al., 2005, 2006, 2010).…”
Section: S. Maiti et al.: Inversion of DC Resistivity Data of Koyna R…
confidence: 99%