This paper summarizes the results of a comprehensive, joint-industry field experiment designed to improve the understanding of the mechanics and modeling of the processes involved in the downhole injection of drill cuttings. The project was executed in three phases: drilling an injection well and two observation wells (Phase 1); conducting more than 20 intermittent cuttings-slurry injections into each of two disposal formations while imaging the created fractures with surface and downhole tiltmeters and downhole accelerometers (Phase 2); and verifying the imaged fracture geometry with comprehensive deviated-well (4) coring and logging programs through the hydraulically fractured intervals (Phase 3).

Drill-cuttings disposal by downhole injection is an economical and environmentally friendly solution for oil and gas operations under zero-discharge requirements. Disposal injections have been applied in several areas around the world, at depths great enough that they will not interfere with surface and subsurface potable-water sources. The critical issue associated with this technology is the assurance that the cuttings are permanently and safely isolated in a cost-effective manner.

The paper presents results showing that intermittent injections (allowing the fracture to close between injections) create multiple fractures within a disposal domain of limited extent. The paper also includes the conclusions of the project and an operational approach to promote the creation of a cuttings disposal domain. The approach introduces fundamental changes in the design of disposal injections, which until recently was based on the assumption that cuttings injections created a single, large storage fracture.
Summary. A performance model for polycrystalline-diamond-compact (PDC) bits was modified to include the wear prediction of individual cutters. Wear predictions are based on the geometry of each cutter, the rock type, the forces acting on the cutter, cutter velocity, and cutter temperature. The results of predicted and actual cutter wear for different 8 1/2-in. (21.6-cm) field-worn PDC bit designs are shown. The worn geometry of each cutter was measured and compared with the model's prediction. Laboratory performance tests were also conducted with similar bit designs to quantify the effect of bit imbalance. Reduced rate of penetration (ROP), an overgauge hole, and increased cutter wear and breakage can all result from bit imbalance.

Introduction

The successful use of PDC bits depends on obtaining a sufficient ROP and length of run to make their application economical. Both the ROP and the length of run (bit life) depend on the bit design, operating parameters, and formation properties. Changes in operating parameters that increase the ROP, such as increasing weight on bit (WOB) or rotary speed, also increase the bit wear rate. Field drilling experience has shown that in some formations no PDC bit design has been economical, while in other formations specific bits have been identified as having a high probability of success. The performance of identical bits run in offset wells is often quite different. This paper describes experimental testing and analytical modeling being used to gain a better understanding of the interaction of the variables that determine the success or failure of a particular bit run. A portion of the work, dealing with modeling cutter forces and instantaneous bit performance, was presented by Warren and Sinor. The current paper expands that work to include the modeling of the abrasive wear of a PDC bit and laboratory observations of the effects of dynamic forces on the bit.

Types of Wear

PDC bit cutter wear can be divided into two categories, depending on the basic cause of the wear. The first category, abrasive wear, is steady-state wear that is normally associated with the development of uniform wear-flats and the gradual degradation of ROP over the bit life. It is a function of the force applied to the cutter, cutter temperature, cutter velocity, formation properties, and cutter properties. Abrasive wear was modeled, and relatively accurate predictions were made of the detailed cutter wear experienced on bits run in the field. The second category of wear is the result of dynamic loading of the cutters. This form of wear is typified by chipped, broken, and lost cutters. Dynamic loading can be caused by abrupt changes in surface drillstring control or by forces induced by cutter/rock interaction. The current work concentrates on evaluating cutter placement to determine the inherent stability of a particular bit design. The stability is determined by evaluating the radial and circumferential forces that tend to cause the bit to rotate about an axis other than the center of the hole. Force balancing of PDC bits is generally recognized to be important for optimum performance and bit life, but in commercial designs it has historically been applied by placing the cutters where they appeared to be "balanced" on the basis of their geometric locations alone. If the radial forces acting on the bit are not balanced, the bit tends to rotate off-center, resulting in reduced ROP and accelerated wear.
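As a minimal illustration of this stability evaluation, the sketch below (Python, with hypothetical field names and made-up numbers) resolves assumed per-cutter radial and circumferential force components into a net lateral imbalance force on the bit; in practice the per-cutter forces would come from a cutter force model such as the one referenced above.

```python
import math

def bit_imbalance(cutters):
    """Resultant lateral (imbalance) force on a PDC bit.

    Each cutter dict holds its angular position on the bit face (rad)
    and radial and circumferential force components (lbf) taken from a
    cutter force model. Names and units are illustrative assumptions.
    """
    fx = fy = 0.0
    for c in cutters:
        theta = c["theta"]
        fr, fc = c["f_radial"], c["f_circumf"]
        # Resolve the radial (outward) and circumferential (tangential)
        # components into fixed bit-face coordinates.
        fx += fr * math.cos(theta) - fc * math.sin(theta)
        fy += fr * math.sin(theta) + fc * math.cos(theta)
    magnitude = math.hypot(fx, fy)
    direction = math.degrees(math.atan2(fy, fx))
    return magnitude, direction

# A design is often considered balanced when the resultant lateral
# force is a small fraction of WOB; the numbers here are made up.
force, angle = bit_imbalance([
    {"theta": 0.0, "f_radial": 120.0, "f_circumf": 300.0},
    {"theta": 2.1, "f_radial": 110.0, "f_circumf": 290.0},
    {"theta": 4.2, "f_radial": 115.0, "f_circumf": 310.0},
])
print(f"imbalance force {force:.0f} lbf at {angle:.0f} deg")
```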
Drilling with an imbalanced bit can sometimes result in an overgauge hole, depending on the amount of gauge stabilization and the applied WOB. The model presented in this paper calculates an imbalance force and direction but does not predict the effect on performance or wear.

Model Background

In the early 1970s, Larsen-Basse surveyed the literature on wear of hard metals and concluded that abrasion and thermal fatigue were the primary causes of wear-flat development. Laboratory wear studies in Jack Fork sandstone showed that PDC cutter wear depends on cutter speed and wear-flat temperature, and suggested that increases in temperature produce exponential increases in wear rate. Glowka and Stone discussed the wear mechanisms for PDC bits and the dependence of wear on cutter temperature. Above 1,382 degrees F [750 degrees C], wear was shown to accelerate because of thermal deterioration and diamond grain pullout, resulting in catastrophic cutter failure. At temperatures below 1,382 degrees F [750 degrees C], the primary mode of wear was described as microchipping abrasive wear. A log-log plot of experimental wear rates vs. wear-flat temperature showed that the wear rate increases dramatically above 662 degrees F [350 degrees C]; because of this accelerated wear, 662 degrees F [350 degrees C] is defined as the critical cutter temperature. Glowka and Ortega's temperature model, along with an empirical wear function based on laboratory and field data, was used in the model discussed in this paper. Glowka derived an expression for wear-flat temperature in which the thermal response function, f, is the effective thermal resistance of the cutter and is a function of cutter configuration, thermal properties, and cooling rates.

The temperature, and thus the wear rate, of a PDC cutter is also affected by cutter balling. It is generally agreed that certain mud types reduce the tendency for bit balling. The model does not predict cutter balling, but the effect of balling on cutter cooling can be simulated by changing the convective heat-transfer coefficient, h. Glowka and Stone showed that a layer of rock flour on the cutter face reduces cutter cooling; for a shale layer 0.039 in. [0.1 cm] thick, the reduction is a factor of 10.

The actual wear rate of a cutter is a function of its contact stress and temperature. The model presented in this section relies heavily on Glowka and Stone's work to provide an estimate of cutter temperature, which is then related to wear rate. Nonlinear regression analysis was used to fit an equation to the tabular data presented by Glowka and Stone for f as a function of cooling rate and cutter wear-flat area (WFA). A constant cooling rate of 1,800 Btu/(hr-ft2-degrees F) [1.0 W/(cm2-degrees C)] was assumed for the calculation. Using this information, along with values available from the PDC force model presented in Ref. 1, the cutter temperature can be calculated from Glowka's expression. Once the cutter temperature is known, the wear rate is estimated from an empirical relationship between wear rate in Jack Fork sandstone and cutter temperature. A relative formation abrasiveness is used to relate the wear rate in Jack Fork sandstone to the particular rock being drilled by the model.
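For reference, a commonly cited form of Glowka's wear-flat temperature relation, consistent with the symbols defined above, is sketched here; treat it as a reconstruction rather than a verbatim reproduction, since the exact published form (including how the frictional heat partition is carried) may differ:

$$ \bar{T}_w = T_f + \bar{q}\,\bar{f}, \qquad \bar{q} = \frac{\alpha\,\mu\,F_n\,v_c}{A_w} $$

Here $\bar{T}_w$ is the mean wear-flat temperature, $T_f$ the ambient fluid/formation temperature, $\bar{q}$ the frictional heat flux into the cutter (heat-partition fraction $\alpha$, friction coefficient $\mu$, normal force $F_n$, cutter velocity $v_c$, wear-flat area $A_w$), and $\bar{f}$ the thermal response function described above.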
In the model, each cutter is divided into a number of discrete elements, and the height worn off each element during a timestep is calculated from the previously determined wear rate. Wear-flat length is based on the height worn off the element and the cutter geometry. The axial force applied to the cutter is divided into a nonproductive component supported on the WFA and a component that causes chip generation. Fig. 1 is a flow chart of the PDC performance model, and Fig. 2 is a schematic of a cutter with wear-flat development in progress. In addition to incorporating bit wear into the model, the cutter force model was modified to provide a more general prediction.
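To make the per-element bookkeeping concrete, the sketch below advances the wear state of one cutter's elements over a timestep and recovers a wear-flat length from the worn height, assuming a flat-faced cylindrical cutter. The exponential temperature dependence and every constant are illustrative placeholders, not the paper's calibrated Jack Fork sandstone relations.

```python
import math

# Illustrative constants only; the paper's empirical Jack Fork
# sandstone relations are not reproduced here.
T_CRIT_F = 662.0    # critical cutter temperature, degrees F
BASE_RATE = 1.0e-5  # reference wear rate below T_CRIT_F, in./sec

def wear_rate(temp_f, rel_abrasiveness):
    """Assumed exponential wear-rate vs. temperature relation.

    The text says wear rate rises exponentially with temperature and
    climbs sharply above 662 degrees F; the constants here are chosen
    only to reproduce that qualitative shape.
    """
    return rel_abrasiveness * BASE_RATE * math.exp(0.01 * (temp_f - T_CRIT_F))

def advance_wear(cutter, temp_f, rel_abrasiveness, dt):
    """Wear each discrete element of one cutter over a timestep dt.

    `cutter` holds its face radius (in.) and a list of element states.
    Wear-flat length is recovered from worn height as the chord of the
    worn circular segment of a flat-faced cylindrical cutter.
    """
    r = cutter["radius"]
    dh = wear_rate(temp_f, rel_abrasiveness) * dt
    for elem in cutter["elements"]:
        elem["height_worn"] = min(elem["height_worn"] + dh, r)
        h = elem["height_worn"]
        elem["wearflat_length"] = 2.0 * math.sqrt(max(r * r - (r - h) ** 2, 0.0))
```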
A detailed test program was performed with an eccentric tool at the Baker Hughes Experimental Test Area (BETA) field research facility to evaluate the feasibility of its use in an expandable-tubular-technology application in the North Sea. The testing used a 9-7/8" Drill Out Steerable Ream While Drilling (DOSRWD) tool in conjunction with 6-1/2" pilot bits (both PDC and roller cone). Motor bent-housing settings of 1.0°, 1.5°, 1.75°, and 2.0° were used to evaluate directional and stability response, with surface rotary speeds of 0, 35, 50, and 75 rpm at each housing setting. Four- and six-arm caliper logs and ultrasonic borehole imaging (UBI) were used to characterize the borehole under all conditions. The analysis covered directional tendencies, downhole vibration monitoring, and borehole diameter, quality, and degradation over time. The test results show that the 9-7/8" DOSRWD system is capable of providing the high-quality wellbore required for expandable tubular technology, ensuring the casing can be run, expanded, and isolated across the formation.

Introduction

Expandable tubular technology has the potential to significantly reduce well construction costs. Conventional well construction results in telescoping of the well size from the wellhead down to the reservoir. Apart from requiring large, expensive surface casing, wellheads, trees, and operating equipment, this approach can result in an unworkably small hole size at the required depth, which in turn can force compromises in well operability or, in the worst case, failure to reach the final objective. Expandable tubulars can help solve difficult drilling challenges posed by high-pressure zones, deepwater environments, and troublesome subsalt plays.1,2,3,4,5 The technology allows operators to explore in remote geologic regions and exploit reserves once considered unprofitable if drilled with conventional technology. Instead of using progressively smaller-diameter pipe as drilling progresses deeper, expandable tubular technology allows tubular diameters to be expanded with specially designed "pigs," or mandrels. This reduces well tapering while preserving borehole size. Expandable technology can also extend the profitable life of mature fields by internally cladding existing wellbores to isolate troublesome zones.

This developing technology has created a need for improved understanding of the directional tendencies of eccentric drilling tools run on steerable assemblies and of the wellbore geometry and quality that can be achieved with these tools. Consistent wellbore diameter is of particular concern for expandable tubulars. If the wellbore diameter is too small, expansion of the pipe with a fixed-diameter cone might not proceed properly across sections of firm formation; worse yet, the expansion cone could become stuck, requiring remediation or a sidetrack of the well. A wellbore that is too large could reduce sealing effectiveness, depending on the sealing system used. For example, a closer diameter tolerance is required if the seal mechanism is an integral part of the casing (an elastomer bonded to the outside of the casing).
Models for the forces required to remove a fixed volume of rock with a single cutter have been applied to different polycrystalline-diamond-compact (PDC) bit designs. Integrating the forces on each cutter over the bit face gives the torque and weight on bit (WOB) required for a particular rate of penetration (ROP). This paper presents the results of comparing such a model with laboratory drilling tests of four radically different bit designs in four different rocks. The geometry of each cutter on the bit was determined by detailed measurement of the bit with a 3-axis coordinate measuring machine. Drilling tests were conducted by "reaming" with an 8-1/2" bit through rock that had previously been drilled with a 6-1/4" bit, so that only the peripheral cutters engaged the rock. Additional tests were conducted in which the WOB and torque were recorded while drilling a pilot hole into the top of a flat rock and while drilling through two different bed boundaries at constant ROP. The model predictions compared well with the measured data for both the reaming tests and the pilot-hole tests.
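As a minimal sketch of the integration described above, the following Python sums per-cutter force components, assumed to be already available from a single-cutter force model, into bit WOB and torque. Field names, units, and the example numbers are illustrative assumptions.

```python
def bit_response(cutters):
    """Sum single-cutter forces over the bit face into WOB and torque.

    Each entry carries the cutter's radial position r (in.) and the
    axial and cutting (tangential) force components (lbf) returned by
    a single-cutter force model; these names are hypothetical.
    """
    wob = sum(c["f_axial"] for c in cutters)                 # lbf
    torque = sum(c["f_cut"] * c["r"] for c in cutters) / 12  # ft-lbf
    return wob, torque

# Example with made-up numbers for a three-cutter toy bit:
print(bit_response([
    {"r": 1.0, "f_axial": 400.0, "f_cut": 250.0},
    {"r": 2.5, "f_axial": 380.0, "f_cut": 240.0},
    {"r": 4.0, "f_axial": 360.0, "f_cut": 230.0},
]))
```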