With the advent of horizontal drilling and hydraulic fracturing of low-permeability reservoirs, it has become common practice to drill wells in the direction of minimum horizontal stress in normally stressed regions to maximize production and minimize drilling risk. The primary reason is that induced fractures tend to propagate in the plane containing the intermediate and maximum principal stresses (typically the overburden and maximum horizontal stresses in a normally stressed region), which under these conditions is perpendicular to the horizontal wellbore. These transverse fractures extend farthest from the wellbore into the reservoir, giving the maximum reservoir contact and fracture coverage.
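As an illustrative sketch only (not part of the study), the geometric relationship described above can be expressed as a small calculation: given a wellbore azimuth and an assumed minimum-horizontal-stress (Shmin) azimuth, the expected fracture strike is perpendicular to Shmin, and the angle at which fractures intersect the well follows directly. The function name and the example azimuths below are hypothetical.

```python
def frac_wellbore_angle(well_azimuth_deg, shmin_azimuth_deg):
    """Angle (degrees) between a horizontal wellbore and the expected
    hydraulic-fracture strike, assuming fractures open against the
    minimum horizontal stress (Shmin) and therefore strike
    perpendicular to it.

    Azimuths are measured clockwise from north; the result is folded
    into [0, 90] because strike lines have no polarity.
    """
    # Fracture plane strikes 90 degrees from Shmin.
    frac_strike = (shmin_azimuth_deg + 90.0) % 180.0
    diff = abs(well_azimuth_deg % 180.0 - frac_strike) % 180.0
    return min(diff, 180.0 - diff)

# A well drilled exactly along Shmin intersects fractures at 90 degrees
# (transverse fractures, the usual design goal):
print(frac_wellbore_angle(30.0, 30.0))   # -> 90.0
# A well drilled along Shmax instead sees longitudinal fractures:
print(frac_wellbore_angle(120.0, 30.0))  # -> 0.0
```

Intermediate angles correspond to the off-azimuth cases discussed next, where fractures intersect the lateral obliquely and reservoir coverage per stage is reduced.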
However, wells are sometimes drilled at an azimuth that is not aligned with the minimum stress. This may result from irregular lease boundaries, or from localized stress regimes that shift the stress direction away from the expected regional trend. Such changes in a well's azimuth relative to the minimum in-situ stress can significantly affect stimulation effectiveness and, ultimately, the wells' estimated ultimate recovery (EUR).
A study of off-azimuth drilling addressed how to determine the minimum stress direction, how such drilling affects wellbore stability and the safe mud-weight window, how the completion design might need to change, and how to estimate the expected changes in productivity. Theoretical and empirical data were used to quantify these effects, including wireline logs, publicly available production data, microseismic data, and computer simulations, with particular emphasis on the Marcellus shale.