Over the last three years, Eni has developed an integrated platform to gather subsurface data and make them available to end users across the company. The platform is structured into four data domains, including well data, which is the focus of this abstract. In the data model, the well master and architectural data are of paramount importance, since they sit upstream in the value chain and act as the aggregator of all the data recorded in a well. A large amount of data coming from all Eni affiliates and operational sites is produced daily. To gather these data, perform quality controls and ingest them, Eni has implemented a governed workflow that ensures data are made available to end users through the platform in an efficient and transparent way. The workflow ingests well master and well geometrical data directly from the well site, defining roles along the whole data transmission chain, with the ultimate objective of ensuring proper data quality once the data reach the cross-functional subsurface data platform. Equally important is the timely availability of such data, to guarantee prompt association with geological and drilling log data. Given the increased volume of data acquired from disparate sources, functions and locations, Business Intelligence tools have been designed to monitor the workflow, combining data for easier insight. To manage such a complex network, dedicated dashboards allow all users involved to visualize the status of the processes through specific KPIs, thus optimizing communication and reducing the human effort required. The ability to manage, validate and quickly interpret data will determine the competitive advantage among energy companies in the near future. Eni aims to excel in this ever-changing business environment by leveraging the asset of the new century: data. The presented approach, combined with widespread data-culture initiatives, promotes a collaborative environment and increases awareness of the importance of data across the value chain.
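The abstract does not detail the individual quality controls, so the following is only a minimal sketch of the kind of automated check such a governed workflow could run at ingestion, together with a simple pass-rate KPI for a monitoring dashboard. All field names, records and thresholds are illustrative assumptions, not Eni's actual schema.

from datetime import date

# Illustrative well-master records; field names are assumptions, not Eni's schema.
wells = [
    {"uwi": "W-001", "lat": 44.5, "lon": 11.3, "spud_date": date(2021, 3, 14)},
    {"uwi": "", "lat": 95.0, "lon": 11.3, "spud_date": date(2021, 5, 2)},
]

def qc_well(rec):
    """Return the list of QC violations found in one well-master record."""
    issues = []
    if not rec["uwi"]:
        issues.append("missing unique well identifier")
    if not -90.0 <= rec["lat"] <= 90.0:
        issues.append("latitude out of range")
    if not -180.0 <= rec["lon"] <= 180.0:
        issues.append("longitude out of range")
    if rec["spud_date"] > date.today():
        issues.append("spud date in the future")
    return issues

results = {rec["uwi"] or "<unknown>": qc_well(rec) for rec in wells}
passed = sum(1 for issues in results.values() if not issues)
kpi_pass_rate = 100.0 * passed / len(wells)  # the KPI a dashboard would display
print(results)
print(f"pass rate: {kpi_pass_rate:.0f}%")

In a workflow of this kind, records that fail such checks would be routed back to the data owner defined for that step of the transmission chain, while the pass-rate KPI feeds the monitoring dashboards.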
The exploration targets of the proposed case history are the Cretaceous sequences in an ENE-WSW trending fold related to the "Syrian Arc" event, and the overlying Oligocene to Miocene sequences that pertain to the Nile Delta system. The structural pattern is sealed by the evaporites associated with the Messinian Salinity Crisis. When facing such complexity, the integration between interpreters and geophysicists is the key to obtaining reliable imaging and hence a reliable geological model. Moreover, an integrated working platform is required to speed up a process that demands strong team interaction and a number of iterations to achieve the desired degree of confidence. The workflow started from the estimation of velocity in the post-Rosetta section, where a global grid tomographic approach was used. For the Rosetta itself, we initially assumed a tentative uniform velocity, which resulted in push-down and pull-up effects. These effects were removed after geophysical and geological evidence provided information about the real geometries of the Rosetta, allowing the velocity variations within the salt to be tuned. In the deepest part, the velocity analysis was even more complex because of the limited offset-to-depth ratio (6000 m maximum offset for 8000 m depth). A modeling exercise gave confidence in the effective illumination of the deeper reflectors, and a large-scale grid tomography was performed. The final velocities in the pre-Rosetta showed two significant velocity inversions, validated by geological models and analogues. The proposed workflow led to significant improvements in the imaging of the pre-Rosetta sequences, both in terms of signal-to-noise ratio and geological reliability of the prospect structure, with a consequent de-risking of the prospects.
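To make the push-down and pull-up mechanism concrete, here is a minimal sketch (not from the paper; all numbers are illustrative) of the depth error introduced when sub-salt reflectors are depth-migrated with an incorrect uniform salt velocity: a too-fast assumed velocity pushes the image down, a too-slow one pulls it up.

def subsalt_depth_shift(h_salt, v_true, v_assumed):
    """Vertical shift of sub-salt reflectors when a salt layer of true
    thickness h_salt (m) and true velocity v_true (m/s) is depth-migrated
    with a uniform assumed velocity v_assumed (m/s).
    Positive result = push-down, negative = pull-up."""
    t_salt = 2.0 * h_salt / v_true        # two-way traveltime through the salt
    h_imaged = v_assumed * t_salt / 2.0   # salt thickness after depth conversion
    return h_imaged - h_salt

# Illustrative numbers only: 1000 m of salt with a true velocity of 4200 m/s.
for v_assumed in (4000.0, 4200.0, 4500.0):
    dz = subsalt_depth_shift(1000.0, 4200.0, v_assumed)
    print(f"v_assumed = {v_assumed:.0f} m/s -> sub-salt depth shift = {dz:+.0f} m")

Once geological evidence constrains the real salt geometry, the assumed velocity can be tuned until such artificial shifts vanish, which is the tuning step the workflow describes.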
Accounting for anisotropy has proved essential for improving imaging quality, to the point that TTI anisotropy has nowadays become a standard for PSDM projects. These advances pose new challenges to migration velocity analysis, and the time-domain approaches commonly used for estimating anisotropic velocity parameters are no longer sufficient to satisfy imaging accuracy requirements. A depth-domain estimation technique is proposed, based entirely on PSDM and on the classic CIG (Common Image Gather) flattening principle. One of its key aspects is a robust, automatic non-hyperbolic moveout picking algorithm, applied to the depth-migrated CIGs, which provides a correct description of the complete residual moveout: this allows the joint tomographic inversion of two anisotropic velocity volumes that properly account for the non-hyperbolic residual moveout behavior at both short and long offsets. The method has general validity and can be applied to any imaging project; it is also particularly stable when applied to multi-azimuth acquisitions. The approach can recover good anisotropic focusing velocities when the data contain sufficiently large offsets, although one or more wells are still needed to constrain the vertical velocity.
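The abstract does not give the exact residual-moveout parameterization, so the fragment below is only a generic sketch: it fits a two-parameter non-hyperbolic curve to automatic picks from a depth-migrated CIG, where the quadratic term captures the short-offset residual and the second term the long-offset, non-hyperbolic deviation, conceptually mirroring the two anisotropic volumes inverted jointly. The functional form and all numbers are assumptions for illustration.

import numpy as np

def fit_rmo(offsets, picked_depths, z0):
    """Least-squares fit of a two-parameter non-hyperbolic residual
    moveout curve to depth picks from a migrated CIG:
        dz(h) = a*h**2 + b*h**4 / (h**2 + z0**2)
    'a' captures the short-offset (quasi-hyperbolic) residual and 'b'
    the long-offset, non-hyperbolic deviation."""
    dz = picked_depths - z0
    G = np.column_stack([offsets**2, offsets**4 / (offsets**2 + z0**2)])
    coeffs, *_ = np.linalg.lstsq(G, dz, rcond=None)
    return coeffs  # (a, b)

# Synthetic picks for a reflector at 3000 m depth (illustrative numbers).
rng = np.random.default_rng(0)
h = np.linspace(0.0, 6000.0, 25)                      # offsets, m
z0 = 3000.0
dz_true = 2e-6 * h**2 - 4e-6 * h**4 / (h**2 + z0**2)  # residual moveout, m
picks = z0 + dz_true + rng.normal(0.0, 2.0, h.size)   # noisy automatic picks
a, b = fit_rmo(h, picks, z0)
print(f"a = {a:.2e} 1/m, b = {b:.2e} 1/m")

In a tomographic update, per-event coefficient pairs of this kind would drive the joint inversion of the two anisotropic velocity volumes until the CIGs flatten at all offsets.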