Over the past two decades, Diagnostic Fracture Injection Tests (DFIT), also referred to in the literature as Injection-Falloff Tests, Fracture Calibration Tests, or Mini-Frac Tests, have evolved into a commonly used and reliable technique to evaluate reservoir properties and fracturing parameters and to estimate in-situ stresses. Since the introduction of DFIT analysis based on the G-function and its derivative, this method has become standard practice for quantifying the minimum in-situ stress and the leak-off coefficient. However, the pressure decline model that underlies the G-function plot makes two distinct and important assumptions: (1) leak-off is not pressure-dependent, and (2) fracture stiffness (or compliance) remains constant during fracture closure. In this study, we first review Nolte's original G-function model and examine the assumptions inherent in it. We then present a new global pressure transient model for pressure decline after shut-in that not only preserves the physics of unsteady-state reservoir flow behavior, elastic fracture mechanics, and material balance, but also incorporates the gradual change of fracture stiffness (or compliance) caused by the contact of rough fracture walls during closure. Analyses of synthetic cases, along with field data, are presented to demonstrate how the coupled effects of fracture geometry, fracture surface asperities, formation properties, pore pressure, and wellbore storage can affect the fracturing pressure decline and the estimation of the minimum in-situ stress. It is shown that Carter's leak-off model is an oversimplification that leads to significant errors in the interpretation of DFIT data. Most importantly, this article reveals that previous methods of estimating the minimum in-situ stress often produce significant over- or underestimates. Based on our modeling and simulation results, we propose a markedly more accurate and reliable method to estimate the minimum in-situ stress and the pressure-dependent leak-off rate.
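
For reference, the classical decline model under assumptions (1) and (2) can be written in standard notation as follows; this is our restatement of Nolte's widely published result, not an equation reproduced from the study itself:

% Nolte's G-function pressure decline: constant fracture compliance and
% pressure-independent (Carter) leak-off give a straight line in G.
\[
  p_w(\Delta t_D) \;=\; p_{\mathrm{ISIP}} \;-\; m_G\, G(\Delta t_D),
  \qquad
  \Delta t_D = \frac{t - t_p}{t_p},
  \qquad
  G(\Delta t_D) = \frac{4}{\pi}\bigl[g(\Delta t_D) - g_0\bigr],
\]
% Bounding forms of g for the low- and high-fluid-efficiency limits:
\[
  g(\Delta t_D) = \tfrac{4}{3}\Bigl[(1+\Delta t_D)^{3/2} - \Delta t_D^{3/2}\Bigr]
  \quad\text{or}\quad
  g(\Delta t_D) = (1+\Delta t_D)\,\sin^{-1}\!\Bigl[(1+\Delta t_D)^{-1/2}\Bigr] + \Delta t_D^{1/2},
\]
where $t_p$ is the pumping time, $g_0 = g(0)$, and the constant slope $m_G$ lumps together the leak-off coefficient and the (assumed constant) fracture compliance. Relaxing either assumption makes $m_G$ vary during closure, which is the departure point for the global pressure transient model developed in this work.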