Objectives/Scope: Please list the objectives and/or scope of the proposed paper. (25–75 words)
The intent of this paper is to recognize and address the challenges that downstream companies face across their value chain, including a history of siloed business processes that have created 'value leaks'. We propose a unified operations management approach that orchestrates all activities across the value chain as part of an enterprise digital transformation strategy. Business processes can be deeply transformed when operations, supply chain and process optimization are connected in a collaborative environment.

Methods, Procedures, Process: Briefly explain your overall approach, including your methods, procedures and process. (75–100 words)
Refineries are at different stages in their digital transformation journey and have built their business workflows over many years across multiple point solutions. We propose a digital transformation strategy for operations built around five key characteristics, that is:

Results, Observations, Conclusions: Please describe the results, observations and conclusions of the proposed paper. (100–200 words)
Across a refinery's value chain, significant benefits can be realized by approaching digital transformation across operations and enabling end-to-end value chain optimization, including:

Novel/Additive Information: Please explain how this paper will present novel (new) or additive information to the existing body of literature that can be of benefit to and/or add to the state of knowledge in the petroleum industry. (25–75 words)
This paper is novel in that it takes a holistic look at a refinery's operations value chain and at eliminating existing value leaks end to end. It recognizes how existing systems have been built up on outdated technology and siloed business processes, and it proposes a path forward that brings operations, supply chain and process optimization together as a key element of a digital transformation strategy. The paper also explores how AI and prescriptive models pave the way for future optimization.
Drilling data quality is notoriously a challenge for any analytics application, due to the complexity of the real-time data acquisition system, which routinely generates: (i) time-related issues caused by irregular sampling, (ii) channel-related issues in terms of non-uniform names and units and missing or wrong values, and (iii) depth-related issues caused by block position resets and depth compensation (for floating rigs). On the other hand, artificial intelligence drilling applications typically require a consistent stream of high-quality data as an input for their algorithms, as well as for visualization. In this work we present an automated workflow, enhanced by data-driven techniques, that resolves complex quality issues, harmonizes sensor drilling data, and reports the quality of the dataset to be used for advanced analytics.

The approach proposes an automated data quality workflow that formalizes the characteristics, requirements and constraints of sensor data within the context of drilling operations. The workflow leverages machine learning algorithms, statistics, signal processing and rule-based engines for the detection of data quality issues including error values, outliers, bias, drifts, noise, and missing values. Further, once data quality issues are classified, they are scored and treated on a context-specific basis in order to recover the maximum volume of data while avoiding information loss. This results in a data quality and preparation engine that organizes drilling data for further advanced analytics and reports the quality of the dataset through key performance indicators.

This novel data processing workflow allowed more than 90% of a drilling dataset comprising 18 offshore wells to be recovered, data that otherwise could not have been used for analytics. This was achieved by resolving specific issues including resampling time series with gaps and different sampling rates, and smart imputation of wrong/missing data while preserving consistency of the dataset across all channels. Additional improvements would include recovering data values that fell outside a meaningful range because of sensor drift or depth resets.

The present work automates the end-to-end workflow for data quality control of drilling sensor data, leveraging advanced Artificial Intelligence (AI) algorithms. It detects and classifies patterns of wrong/missing data and recovers them through a context-driven approach that prevents information loss. As a result, the maximum amount of data is recovered for artificial intelligence drilling applications. The workflow also enables optimal time synchronization of different sensors streaming data at different frequencies within discontinuous time intervals.
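To make two of the steps described above concrete, the sketch below shows how an irregularly sampled drilling channel could be masked against a plausible physical range, resampled to a uniform rate, and interpolated only across short gaps. This is a minimal illustration under assumed conventions, not the authors' implementation: the channel names, value ranges, sampling rate, and gap threshold are all hypothetical.

```python
# Minimal sketch (assumed parameters, not the paper's actual engine) of
# range-based masking, uniform resampling, and gap-aware interpolation
# for a single time-indexed drilling channel.

import numpy as np
import pandas as pd

# Illustrative plausible physical ranges per channel (assumptions).
VALID_RANGE = {
    "ROP": (0.0, 150.0),    # rate of penetration, m/h
    "HKLD": (0.0, 600.0),   # hookload, klbf
    "SPP": (0.0, 7500.0),   # standpipe pressure, psi
}

def clean_channel(series: pd.Series, channel: str,
                  rate: str = "1s", max_gap: int = 10) -> pd.Series:
    """Mask out-of-range values, resample to a uniform grid, and
    interpolate only short gaps so long outages stay visibly missing."""
    lo, hi = VALID_RANGE[channel]

    # 1. Mask physically impossible values (e.g. sensor error codes).
    masked = series.where((series >= lo) & (series <= hi))

    # 2. Resample irregular timestamps to a uniform grid (mean per bin).
    uniform = masked.resample(rate).mean()

    # 3. Interpolate short gaps only; longer gaps remain NaN so that
    #    downstream analytics can see where data was genuinely missing.
    return uniform.interpolate(method="time", limit=max_gap)

# Usage example: irregularly sampled ROP readings with one error value.
idx = pd.to_datetime(["2024-01-01 00:00:00.0", "2024-01-01 00:00:01.4",
                      "2024-01-01 00:00:03.0", "2024-01-01 00:00:07.0"])
rop = pd.Series([25.1, 26.0, -999.0, 27.3], index=idx)
print(clean_channel(rop, "ROP"))
```

In practice, the context-specific scoring the abstract describes would decide per channel and per interval whether masked values are imputed or left missing; the fixed `max_gap` threshold here stands in for that logic.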