The Arctic region holds vast amounts of extractive energy resources. Most of these resources lie offshore, beneath thick ice and deep water, in environmentally very sensitive areas. Weather and distance from existing infrastructure and population centers pose additional operational and logistic challenges. Oil and gas operations in the high north are likely to entail the remote and distributed control of assets, placing heavy demands on communication links and information flow. Connecting and integrating business processes and information sources across disciplines, geographical locations and organizational boundaries adds to the complexity. Further, operations in the high north require particular sensitivity to environmental aspects. In order to meet all these requirements while maintaining profitable operations, the industry has to create new field development and operation concepts that include heavily instrumented facilities. Significant focus must also be placed on the transfer of real-time data between fields and operation centers located elsewhere, and on automating key work processes to handle the large volumes of data. A prerequisite for this development is a robust digital infrastructure and a platform for effective and efficient information exchange. This is what the Norwegian Oil Industry Association has called Integrated Operations Generation 2. Central to this vision is a much higher degree of functionality being distributed geographically and organizationally, based upon a digital platform that is distributed to a greater extent than is common today.
The platform defines interfaces and employs data transfer and distributed intelligence based on open standards, allowing for a much higher degree of interoperability across applications, disciplines, geographic locations and organizations than is common today. The main aim of the "Integrated Operations in the High North" project is to build and demonstrate a digital platform for Integrated Operations Generation 2. The project is a unique collaboration between the ICT, defense and oil and gas industries, as well as university research groups. During a four-year period starting May 2008, the 26 participants in the project are working together to develop and demonstrate a platform for Integrated Operations Generation 2. Pilots defined for drilling, production, and operations and maintenance are used as a basis to generate requirements early in the project, and at later stages to demonstrate the value of the platform. The project is also relevant for gaining experience with where and how cutting-edge ICT technologies, such as the semantic and agent technologies that target the "Internet of things", may generate value within the oil & gas industry.
For anyone involved in organizational changes, people development, optimization of work processes, and reduction of overall risk exposure at the rigsite, the ultimate goal is to develop automation where possible. Automation has been used for years in the car industry, and the oil industry is slowly adopting the potential of automated systems for the drilling environment, gradually integrating all associated processes and the full range of important downhole data. Baker Hughes, with strong support from Statoil, has since 1999 developed a remote operations model based on the Baker Expert Advisory Centre/Operations Network (BEACON) platform, starting with remote operations monitoring from an onshore operations centre. Subject matter experts were placed in the operations centre to process the data in real time, leading to significant changes in work processes both on- and offshore. New positions were developed and new shift plans were implemented. As new downhole tools were developed, new service levels such as drilling optimization, ECD management and reservoir navigation services were introduced, all delivered remotely from the operations centre with no additional personnel at the rigsite, made possible by rig connectivity, data transfer capabilities and proper work process delineation. Tomorrow's solution will integrate all available surface and downhole data, and automated advisory systems will deliver advice based on a wider range of combined real-time data, as well as historical databases and best practice, supporting individual judgment and assumptions. This change will contribute significantly to improved operational performance as well as risk mitigation and reduced Health, Safety and Environment (HS&E) exposure. Drilling process automation is something the industry has been anticipating.
This paper discusses the automation potential with respect to remote operations capability, the required evolution of traditional field positions, collaboration models (e.g., operator/service provider/rig contractor), improved operational efficiency, and expected reductions in operational cost.
Data quality issues have for many decades been a problem for drilling data. To some extent, the development of data transfer standards has helped achieve better data quality and data transport. In the early stages of WITSML, poor data quality was a concern, and in this paper we look at various steps that have been taken to improve it. Sensor technology has improved considerably in recent years, with fieldbus options that allow for remote calibration and diagnostics. In addition, calibration routines are streamlined, and range checks can be implemented at the point of acquisition. The data acquisition software now has built-in quality control to address errors in manual data input. We have also developed software at the rig site that performs several data quality checks in the database. After acquisition, the data is converted and transferred to a centrally hosted WITSML 1.4.1.1 server, where several applications perform data quality assurance, e.g. checking for data gaps. In addition, the data flow is monitored 24/7 from an operations center before the data is consumed by several applications. We have been working closely with one operator for several years to improve processes in WITSML data deliveries. To ensure agreement on what data is expected to be delivered, this company has established electronic order forms that are sent to us for a quality check before the section starts. This operator has also developed a sophisticated data quality monitoring system that produces KPI scores linked to the SLA. Some results from research on using statistics to uncover abnormal sensor response in acquired data will also be presented. Statistics show how data quality is improving even as the amount of data acquired from a single rig increases year by year.
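Two of the checks mentioned above, range checks at the point of acquisition and gap detection on the server, can be sketched in a few lines. This is a minimal illustration only; the channel names, limits and gap threshold below are hypothetical, not taken from any WITSML deployment.

```python
from datetime import datetime, timedelta

# Hypothetical channel limits and gap threshold for illustration.
RANGE_LIMITS = {"hookload_kkgf": (0.0, 500.0), "rop_m_per_hr": (0.0, 250.0)}
MAX_GAP = timedelta(seconds=10)  # flag intervals longer than this as data gaps


def range_check(sample: dict) -> list:
    """Return the names of channels whose values fall outside their limits."""
    bad = []
    for channel, (lo, hi) in RANGE_LIMITS.items():
        value = sample.get(channel)
        if value is not None and not (lo <= value <= hi):
            bad.append(channel)
    return bad


def find_gaps(timestamps: list) -> list:
    """Return (start, end) pairs where consecutive samples are too far apart."""
    gaps = []
    for prev, curr in zip(timestamps, timestamps[1:]):
        if curr - prev > MAX_GAP:
            gaps.append((prev, curr))
    return gaps
```

In practice, a range violation would be flagged at acquisition time, while the gap check would run against timestamps retrieved from the hosted server.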
The development of remote operational centers has improved data quality due to the increased focus on data acquisition and real-time usage of data. Data quality has different connotations for the various participants in the oilfield. Data streaming and continuous flow of data are the main focus on the technical side of delivering and receiving data, while scientists focus on the accuracy of each data point. Most surface sensors measure milliamps, with calibrations for accuracy. Reservoir measurements have porosity and permeability as the main reservoir properties, values that are not measured directly but derived from other sources. In automation processes, data streaming must be flawless. Through remote operations, the streaming quality and accuracy of the data are kept under surveillance. In most areas the network response time is too long to stream data from the well to a remote location and then stream a command or solution back for full closed-loop control. During acquisition, aggregation, distribution and finally visualization, there is room for changes in the data points. These changes stem from uncertainty, stacking, filtering, unpacking, transmission, and any other data handling process involved in data sharing. For the digital oil field, data are evaluated in two different settings: in real time and after the event. The interpretation is performed either remotely or at the wellsite. There is room for improvement in all areas, depending on the objectives of the process:
- Automation in the operational phase
- Interpretation based on a model update
- Automated quality control
This paper illustrates the differences and similarities between real-time operations and processes performed on the data later, and how combined local and remote operations enhance data quality. We follow up with improvement suggestions in all areas on the way toward the digital oil field.
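As a small illustration of the automated quality control mentioned above, a streamed channel can be screened for abnormal sensor response by comparing each incoming value against a trailing window of accepted samples. This is a generic sketch under assumed parameters; the window size and threshold are hypothetical and would be tuned per channel in a real system.

```python
from collections import deque
from statistics import mean, stdev


class RollingOutlierFlag:
    """Flag incoming samples that deviate strongly from a trailing window.

    Illustrative automated-QC sketch: window size and z-score threshold
    are assumptions, not values from any specific rig system.
    """

    def __init__(self, window: int = 20, threshold: float = 3.0):
        self.window = deque(maxlen=window)
        self.threshold = threshold

    def check(self, value: float) -> bool:
        """Return True if the value looks abnormal relative to the window."""
        flagged = False
        if len(self.window) >= 5:  # need a few samples before judging
            mu, sigma = mean(self.window), stdev(self.window)
            if sigma > 0 and abs(value - mu) > self.threshold * sigma:
                flagged = True
        if not flagged:
            self.window.append(value)  # only learn from accepted samples
        return flagged
```

Because the detector only updates its window with accepted samples, a single spike does not contaminate the baseline used to judge subsequent values.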