Learning from successful safety outcomes, or what went right, is an important emerging component of maintaining safe systems. Accordingly, there are increasing calls to study normal performance in near misses as part of safety management activities. Despite this, there is limited guidance on how to accomplish this in practice. This article presents a study in which Rasmussen's risk management framework was used to analyse 16 serious incidents from the aviation domain. The findings show that accidents are prevented by a network of protective factors identified across the sociotechnical system. These protective networks share many properties with those identified in accidents. The article demonstrates that it is possible to identify these networks of protective factors from incident investigation reports. The theoretical implications of these results and future research opportunities are discussed. Practitioner Statement: The analysis of near misses is an important part of safety management activities. This article demonstrates that Rasmussen's risk management framework can be used to identify networks of protective factors which prevent accidents. Safety practitioners can use the framework described to discover and support these system-wide networks of protective factors.
Data quality issues have been a problem for drilling data for many decades. To some extent, the development of data transfer standards has helped achieve better data quality and data transport. In the early stages of WITSML, poor data quality was a concern, and in this paper we look at various steps that have been taken to improve it. Sensor technology has improved considerably in recent years, with fieldbus options that allow for remote calibration and diagnostics. In addition, calibration routines have been streamlined, and range checks can be implemented at the point of acquisition. The data acquisition software now has built-in quality control to address errors in manual data input. We have also developed software at the rig site that performs several data quality checks against the database. After acquisition, the data is converted and transferred to a centrally hosted WITSML 1.4.1.1 server. Here, several applications perform data quality assurance, e.g. checking for data gaps. In addition, the data flow is monitored 24/7 from an operations center before the data is consumed by several applications. We have worked closely with one operator for several years to improve processes in WITSML data deliveries. To ensure agreement on what data is expected to be delivered, this company has established electronic order forms that are sent to us for quality checking before the section starts. This operator has also developed a sophisticated data quality monitoring system that produces KPI scores linked to the SLA. Some results from research into using statistics to uncover abnormal sensor response in acquired data will also be presented. Statistics show how data quality is improving even as the amount of data acquired from a single rig increases year by year.
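To make the kinds of checks described above concrete, the following is a minimal sketch, in Python, of two of them: a range check applied at the point of acquisition and a gap check applied to time-indexed log data. The channel name, validity limits, and expected sampling interval are illustrative assumptions for this sketch; they are not taken from the rig-site software, the WITSML server applications, or the operator's monitoring system described in the abstract.

```python
# Sketch of two data quality checks: a per-channel range check and a
# data-gap check on time-indexed samples. All names and limits are
# hypothetical placeholders, not actual WITSML delivery configuration.
from datetime import datetime, timedelta
from typing import List, Tuple

# Hypothetical per-channel validity limits (channel name and units assumed).
RANGE_LIMITS = {"HKLD": (0.0, 600.0)}  # e.g. hookload, assumed bounds


def range_check(channel: str, value: float) -> bool:
    """Return True if the sample lies inside the configured physical range."""
    lo, hi = RANGE_LIMITS[channel]
    return lo <= value <= hi


def find_gaps(timestamps: List[datetime],
              expected_interval: timedelta = timedelta(seconds=5),
              tolerance: float = 1.5) -> List[Tuple[datetime, datetime]]:
    """Report consecutive timestamps whose spacing exceeds
    tolerance * expected_interval, i.e. candidate data gaps."""
    gaps = []
    for prev, curr in zip(timestamps, timestamps[1:]):
        if (curr - prev) > expected_interval * tolerance:
            gaps.append((prev, curr))
    return gaps


if __name__ == "__main__":
    t0 = datetime(2024, 1, 1, 0, 0, 0)
    times = [t0 + timedelta(seconds=5 * i) for i in range(5)]
    times.append(times[-1] + timedelta(seconds=60))  # simulate a 60 s gap
    print(range_check("HKLD", 250.0))  # True: inside the assumed range
    print(find_gaps(times))            # one reported gap
```

In practice, such checks would run both at the rig site (before transfer) and on the hosted WITSML server, with the results feeding the KPI scoring the operator links to the SLA.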