Flow rate and fluid type (phase) are two of the most fundamental parameters needed to characterize well performance. Traditional methods of estimating these parameters, particularly for real-time detection and diagnosis of production anomalies, have been limited by sampling frequency and data quality. This paper presents field-test results of a new type of downhole multiphase flowmeter, which confirm the value of permanent downhole metering. The meter contains only three sensors but is capable of direct multiphase-flow-rate and cut measurements without slip models, even in highly deviated, recirculating flow. The physics basis and flowloop tests are discussed.
Specimens have long been viewed as critical to research in the natural sciences because each specimen captures the phenotype (and often the genotype) of a particular individual at a particular point in space and time. In recent years there has been considerable focus on digitizing the many physical specimens currently in the world’s natural history research collections. As a result, a growing number of specimens are each now represented by their own “digital specimen”, that is, a findable, accessible, interoperable and re-usable (FAIR) digital representation of the physical specimen, which contains data about it. At the same time, there has been growing recognition that each digital specimen can be extended, and made more valuable for research, by linking it to data/samples derived from the curated physical specimen itself (e.g., computed tomography (CT) scan imagery, DNA sequences or tissue samples), directly related specimens or data about the organism's life (e.g., specimens of parasites collected from it, photos or recordings of the organism in life, immediate surrounding ecological community), and the wide range of associated specimen-independent data sets and model-based contextualisations (e.g., taxonomic information, conservation status, bioclimatological region, remote sensing images, environmental-climatological data, traditional knowledge, genome annotations). The resulting connected network of extended digital specimens will enable new research on a number of fronts, and indeed this has already begun. The new types of research enabled fall into four distinct but overlapping categories. First, because the digital specimen is a surrogate—acting on the Internet for a physical specimen in a natural science collection—it is amenable to analytical approaches that are simply not possible with physical specimens. 
For example, digital specimens can serve as training, validation and test sets for predictive process-based or machine learning algorithms, which are opening new doors of discovery and forecasting. Such sophisticated and powerful analytical approaches depend on FAIR principles, and on extended digital specimen data being as open as possible. These approaches also underpin biodiversity monitoring outputs that are critically needed by the biodiversity community because they are central to conservation efforts at all levels of analysis, from genetic to species to ecosystem diversity. Second, linking specimens to closely associated specimens (potentially across multiple disparate collections) allows for the coordinated co-analysis of those specimens. For example, linking specimens of parasites/pathogens to specimens of the hosts from which they were collected allows for a powerful new understanding of coevolution, including pathogen range expansion and shifts to new hosts. Similarly, linking specimens of pollinators, their food plants, and their predators can help untangle complex food webs and multi-trophic interactions. Third, linking derived data to their associated voucher specimens increases information richness, density, and robustness, thereby allowing for novel types of analyses and strengthening validation through linked independent data, thus improving confidence levels and risk assessment. For example, digital representations of specimens that incorporate, e.g., images, CT scans, or vocalizations may capture important information that otherwise is lost during preservation, such as coloration or behavior. In addition, permanently linking genetic and genomic data to the specimen of the individual from which they were derived—something that is currently done inconsistently—allows for detailed studies of the connections between genotype and phenotype.
Furthermore, persistent links between physical specimens, additional information, and associated transactions are the building blocks for documenting and preserving chains of custody. The links will also facilitate data cleaning and updating, as well as maintenance of digital specimens and their derived and associated datasets, with ever-expanding research questions and applied uses materializing over time. The resulting high-quality data resources are needed for fact-based decision-making and forecasting based on monitoring, forensics and prediction workflows in conservation, sustainable management and policy-making. Finally, linking specimens to diverse but associated datasets allows for detailed, often transdisciplinary, studies of topics ranging from local adaptation, through the forces driving range expansion and contraction (critically important to our understanding of the consequences of climate change), to social vectors in disease transmission. A network of extended digital specimens will enable new and critically important research and applications in all of these categories, as well as science and uses that we cannot yet envision.
A variety of technologies have been used to address the challenges faced by Production Logging (PL) in highly deviated and horizontal wells. Different configurations of array sensors have been deployed in these environments to address well-work objectives. There are situations where it is hard to differentiate between qualitative and quantitative answers; the question is how quantitative high-angle and horizontal PL data really are. Multiple datasets were studied, showing that value can be added to raw array data through improvements to processing and interpretation. There is a need to differentiate between data acquisition and data processing and interpretation for high-angle and horizontal PL. This paper describes a probabilistic approach developed to combine sensor responses into a quantitative solution. Individual sensors from a multiple-array production suite can be used in combination with centralised (conventional) sensors to address reservoir conditions and well-access challenges. Difficult well completions and rig-height limitations add further complexity in such environments. This probabilistic approach is applied to a complex North Sea example to gain greater reservoir understanding with a rapid turnaround.
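The abstract does not specify how the probabilistic combination works. As a minimal illustrative sketch (not the paper's actual method), one common way to fuse several independent estimates of the same quantity, such as water holdup from different sensor arrays, is inverse-variance weighting, where more precise sensors get more weight:

```python
# Illustrative sketch only: inverse-variance fusion of independent sensor
# estimates. The weighting scheme and the example numbers are assumptions
# for illustration, not the probabilistic method described in the paper.

def fuse_estimates(estimates, variances):
    """Combine independent measurements of the same quantity.

    Each estimate is weighted by the inverse of its variance, so the
    fused value is dominated by the most precise sensors.
    """
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    fused = sum(w * e for w, e in zip(weights, estimates)) / total
    fused_variance = 1.0 / total  # variance of the fused estimate
    return fused, fused_variance

# Example: water holdup estimated by a capacitance array, a resistivity
# array, and a centralised (conventional) sensor, with assumed variances.
holdup, var = fuse_estimates([0.62, 0.58, 0.70], [0.01, 0.02, 0.05])
```

The fused estimate lands between the inputs, pulled toward the lower-variance sensors; a full probabilistic workflow would also model sensor bias and flow-regime effects.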
The newest generation of production logging tools consists of multiple sensors in multiple locations around the wellbore that incorporate 12 resistivity and capacitance probes and six spinners. The capacitance array tool (CAT™) determines the water, oil, and gas holdup in the wellbore. The resistivity array tool (RAT™) determines the holdup of hydrocarbons and water. Likewise, the spinner array tool (SAT™) consists of six bowspring mounted micro-spinners that enable the measurement of the velocity profile. These new tools provide a detailed examination of the flowing fluids in all types of wells, including highly deviated and horizontal wellbores, that is not available with the traditional center sample tools because of the wellbore conditions, especially with fluid segregation. With these 30 measurements, a system of quality control and processing was developed to enable both experienced and non-experienced engineers to determine whether or not the data was correct and valid. A quick analysis tool was developed to enable the field engineer and company representative to enter raw values from the two holdup devices and calibration values, and to determine the holdups from the two sensors. Similarly, entering the raw spinner counts, cable speed, and estimated spinner slopes into the quick analysis tool will provide an estimate of the velocity profile for the SAT spinners and the other spinners that are run. This quick analysis tool graphically shows the holdups and velocity in an easy-to-understand presentation for people who are not production logging (PL) experts. After the raw data in the field is validated, a complete analysis is provided. This analysis includes horizontal, vertical, and 3D images of holdup and velocity profiles; continuous displays of flow profiles; and a complete flow analysis consisting of the split of oil, gas, and water rates at both downhole and surface conditions. 
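The quick-analysis arithmetic described above can be sketched as follows. The linear spinner-calibration model and the probe-voting holdup rule are common production-logging conventions assumed here for illustration; the actual field software may differ.

```python
# Hedged sketch of the "quick analysis" calculations: holdup from an
# array of probes and fluid velocity from a calibrated spinner.
# Both models are simplifying assumptions, not the vendor's algorithm.

def probe_holdup(probe_reads_water):
    """Water holdup as the fraction of array probes reading water."""
    return sum(probe_reads_water) / len(probe_reads_water)

def spinner_fluid_velocity(rps, cable_speed, slope, intercept=0.0):
    """Apparent fluid velocity from a linear spinner calibration.

    Assumes rps = slope * (v_fluid - cable_speed) + intercept, i.e. the
    spinner responds linearly to fluid velocity relative to the tool.
    """
    return cable_speed + (rps - intercept) / slope

# Example: a 12-probe holdup array with 8 probes in water, and one
# micro-spinner at 4.5 rps while logging at 60 ft/min (assumed numbers).
yw = probe_holdup([True] * 8 + [False] * 4)
v = spinner_fluid_velocity(rps=4.5, cable_speed=60.0, slope=0.1)
```

In practice each of the six spinners gets its own calibration slopes (up and down passes), and each of the 12 probes its own wet/dry calibration values, which is exactly the data the quick-analysis tool asks the field engineer to enter.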
This PL data can be presented in standard log formats, spreadsheets, and other methods as needed. This process can be modified by either the service company or customer. Several examples are provided that show the capabilities of the new logging tools and the interpretation method used to determine the results.

Introduction

Phase segregation occurs in many wells, including those with little deviation from vertical; the lighter phases migrate to the high side of the wellbore, and the heavier phases migrate to the low side. In highly deviated and horizontal wellbores, traditional PL sensors, which are center sample tools or have single point measurements, may not provide the most accurate data as a result of the wellbore and well flowing conditions. These PL tools measure fluid properties, such as velocity, density, capacitance, temperature, and pressure. Tool position, or more accurately sensor position, may lead to incorrect interpretations regarding the flow environment of the well. New PL tools have been developed to help address the issues in deviated or horizontal wells. These new tools include two types of holdup measurements, capacitance and resistivity, as well as multiple velocity measurements. These new tools will be referred to as Production Array Logs (PAL) to distinguish them from the standard PL logs. These tools provide a relative bearing measurement that enables the location of each sensor to be determined. The velocity tool also includes an inclination measurement to aid in the analysis of the PAL data. The holdup tools have 12 measurement probes, and the velocity tool has six spinners. These tools, when run in conjunction with the standard tool string, provide multiple measurements around the entire wellbore. The interpretation of each tool individually is complex and, when combined with the other PAL measurements, the complexity increases dramatically.
A new interpretation process was developed that combines the benefits of the newer sensors and addresses problems caused by the deviated and horizontal wellbores in the standard PL interpretation procedures.
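The final step of such an interpretation, splitting the total flow into oil, gas, and water rates at downhole and surface conditions, can be sketched with standard production-logging arithmetic. The formulas below (rate from pipe area, velocity, and holdup; surface conversion via a formation volume factor) are textbook conventions, not the specific process this paper develops, and the slip between phases is ignored for simplicity:

```python
import math

# Hedged sketch: downhole phase rate from holdup and velocity, then
# conversion to surface conditions with a formation volume factor.
# All names and numbers are illustrative assumptions.

def downhole_rate(pipe_id_in, velocity_ft_min, holdup):
    """Downhole phase rate in bbl/d from pipe inner diameter (inches),
    phase velocity (ft/min), and phase holdup; slip is neglected."""
    area_ft2 = math.pi * (pipe_id_in / 12.0) ** 2 / 4.0
    q_ft3_day = area_ft2 * velocity_ft_min * holdup * 60 * 24
    return q_ft3_day / 5.615  # 1 bbl = 5.615 ft^3

def surface_rate(q_downhole_bbl_d, fvf):
    """Convert a downhole rate to surface conditions using the phase's
    formation volume factor B (reservoir bbl per stock-tank bbl)."""
    return q_downhole_bbl_d / fvf

# Example: 6-in. pipe, 100 ft/min oil velocity, 0.4 oil holdup, B_o = 1.25.
q_oil_dh = downhole_rate(pipe_id_in=6.0, velocity_ft_min=100.0, holdup=0.4)
q_oil_surf = surface_rate(q_oil_dh, fvf=1.25)
```

A real interpretation replaces the single velocity and holdup with the profiles measured across the six spinners and twelve probes, and applies slip models or direct measurements where phases travel at different speeds.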
A key factor in managing mature fields is to establish adequate surveillance in each phase of their life. The complexity increases when the field is developed with horizontal wells, and differences in data quality and resolution should be taken into consideration when planning such surveillance. Current uncertainties in the Harding field relate to unreliable well-conformance data from conventional production logs (PL) and to assumptions in the reservoir description that are below seismic resolution. We describe the learnings from a horizontal well in Harding, where appropriate surveillance enhanced reservoir understanding and the quality of decision making. Based on the initial understanding from the reservoir model, an insert-string well-work option was proposed to reduce water cut. Historically in this field, conventional PLs provided unreliable well-conformance data in horizontal multiphase flow. To improve characterization at the well scale, an array PL was deployed for the first time on this field. The flowing results revealed that the insert-string solution was inappropriate and would result in lost oil production. The shut-in data identified crossflow between two zones separated by a shale section. In the initial model, this shale was mapped only at a local level. Post-surveillance, it was remapped on seismic as an extensive baffle affecting an area with more mobile oil to recover, and a potential upside was identified in a new infill target toward the toe of this well. Most of the initial decisions about the insert string were based on seismic and modeling work. The new array PL data brought additional information into the model, increasing confidence in the results. Data resolution at the well level matters, and this highlights the need for more PL measurements to calibrate the seismic response and improve the reservoir model.