2004
DOI: 10.1111/j.1365-2478.2005.00432.x
Inversion of nuclear well‐logging data using neural networks

Abstract: This work looks at the application of neural networks in geophysical well-logging problems and specifically their utilization for inversion of nuclear downhole data. Simulated neutron and γ-ray fluxes at a given detector location within a neutron logging tool were inverted to obtain formation properties such as porosity, salinity and oil/water saturation. To achieve this, the forward particle-radiation transport problem was first solved for different energy groups (47 neutron groups and 20 γ-r…

Cited by 31 publications (32 citation statements); references 30 publications.
“…In BP algorithms, the learning rate, η, is a small number (0.1 < η < 1.0) (Aristodemou et al., 2005) that controls the amount of error that will be negatively added to the interconnection weights for the next iteration (Cranganu, 2007). If the learning rate is large, then large weight changes are allowed, and no learning occurs.…”
Section: Setting the Learning Rate and Momentum
confidence: 99%
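The weight update this excerpt describes can be sketched as a plain gradient-descent step, where the learning rate η scales how much of the error gradient is applied to each interconnection weight. This is a minimal illustration, not the cited authors' implementation; the function and variable names are hypothetical.

```python
# One gradient-descent weight update in backpropagation (sketch).
# eta is the learning rate: a small factor (e.g. 0.1 < eta < 1.0)
# controlling how much of the error gradient is subtracted from
# each interconnection weight before the next iteration.
def update_weights(weights, gradients, eta=0.3):
    """Return new weights after one gradient-descent step."""
    return [w - eta * g for w, g in zip(weights, gradients)]

weights = [0.5, -0.2, 0.8]
gradients = [0.1, -0.05, 0.2]
new_weights = update_weights(weights, gradients)
```

A larger η makes each step bigger; as the excerpt notes, overly large steps can prevent the network from settling into a solution.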
“…2) (i) a set of nodes (artificial neurons), where the nodes perform simple computations, (ii) a set of interconnections or synapses linking pairs of nodes, and (iii) a set of labels known as weights, associated with each interconnection and identifying some property of the interconnection. These weights correspond to the synaptic efficiency of the biological neurons (ARISTODEMOU et al., 2005).…”
Section: Figure
confidence: 99%
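The three ingredients listed above (nodes, weighted interconnections, and a simple per-node computation) can be illustrated with a single artificial neuron. This is a generic sketch under the usual weighted-sum-plus-activation model, not code from the cited papers; the sigmoid activation and all names are assumptions.

```python
import math

# A single artificial neuron (sketch): inputs arrive over weighted
# interconnections; the node sums them, adds a bias, and applies a
# simple nonlinearity (here, the logistic sigmoid).
def neuron_output(inputs, weights, bias=0.0):
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-total))  # sigmoid activation

out = neuron_output([1.0, 0.5], [0.4, -0.6])  # a value in (0, 1)
```

The weights play the role the excerpt assigns them: each one labels an interconnection and scales that input's contribution to the node's computation.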
“…2), whereas a corresponding output response is presented to the last layer of processing elements. All connections between the input layer, the ''hidden'' layers, and the output layer are then adjusted using an objective function (or ''cost function'') such as the mean-squared error (MSE) (ARBOGAST et al., 2000), the sum-squared error (SSE) (ARISTODEMOU et al., 2005) or the global error E (LUTHI and BRYANT, 1997). For example, the global error E is computed as the sum of the squared differences between the computed output y_k and the desired output d_k (i.e., the local errors).…”
confidence: 99%
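The objective functions named in this excerpt are closely related: the global error E is the sum of squared differences between computed outputs y_k and desired outputs d_k, and the MSE is that sum divided by the number of outputs. A minimal sketch (function names are hypothetical; some texts include an extra factor of 1/2 in E, which is omitted here as the excerpt defines E as a plain sum):

```python
# Global error E: sum of squared differences between computed
# outputs y_k and desired outputs d_k (the "local errors").
def global_error(y, d):
    return sum((yk - dk) ** 2 for yk, dk in zip(y, d))

# Mean-squared error: the same sum averaged over the outputs.
def mse(y, d):
    return global_error(y, d) / len(y)

y = [0.9, 0.2, 0.4]   # computed outputs
d = [1.0, 0.0, 0.5]   # desired outputs
E = global_error(y, d)  # 0.01 + 0.04 + 0.01 ≈ 0.06
```

Training adjusts the connection weights so as to drive this objective down over successive iterations.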
“…For example: (1) seismic event classification (Dysart and Pulli, 1990), (2) well log analysis (Aristodemou et al., 2005; Maiti et al., 2007; Maiti and Tiwari, 2007, 2010b), (3) first-arrival picking (Murat and Rudman, 1992), (4) earthquake prediction (Feng et al., 1997), (5) inversion (Raiche, 1991; Devilee et al., 1999), (6) parameter estimation in geophysics (Macias et al., 2000), (7) prediction of aquifer water level (Coppola Jr. et al., 2005), (8) magnetotelluric data inversion (Spichak and Popova, 2000), (9) magnetic interpretations (Bescoby et al., 2006), (10) signal discrimination (Maiti and Tiwari, 2010a), (11) modeling (Sri Lakshmi and Tiwari, 2009), (12) DC resistivity inversion (Qady and Ushijima, 2001; Lampinen and Vehtari, 2001; Singh et al., 2005, 2006, 2010).…”
Section: S. Maiti et al.: Inversion of DC Resistivity Data of Koyna R…
confidence: 99%