A long-standing question at rifted margins is why the extension restored from upper-crustal faulting is usually less than that calculated from subsidence models or from crustal thickness estimates, the so-called "extension discrepancy." Here we revisit this issue in light of recently completed numerical modeling results. We extract thinning profiles from four end-member geodynamic model rifts of varying width and asymmetry and propose tectonic models that best explain those results. We then relate the spatial and temporal evolution of upper-crustal to lower-crustal thinning, or crustal depth-dependent thinning (DDT), and of crustal to mantle thinning, or lithospheric DDT, relations that are difficult to establish in natural systems because observations constraining thinning at intermediate stages between prerift extension and lithospheric breakup are lacking. Our results support the hypothesis that crustal DDT cannot be the main cause of the extension discrepancy, which may itself be overestimated because distributed deformation, polyphase faulting, and detachment faulting are difficult to recognize in seismic data. More importantly, the results indicate that lithospheric DDT likely dominates at specific stages of rift evolution, because crustal and mantle thinning distributions are not always spatially coincident and at times are not even balanced by an equal magnitude of thinning in two dimensions. Moreover, either pure-shear or simple-shear models can apply at different times and locations, depending on the type of rift. Both DDT and pure/simple-shear variations in space and time can produce the observed complex fault geometries, uplift and subsidence patterns, and thermal histories.
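To make the comparison explicit, it can be written in terms of stretching and thinning factors; the symbols below ($t_0$, $t$, $\beta$, $\gamma$) are illustrative shorthand rather than notation defined above. For a layer of initial thickness $t_0$ thinned to $t(x)$,
\[
\beta(x) = \frac{t_0}{t(x)}, \qquad \gamma(x) = 1 - \frac{t(x)}{t_0} = 1 - \frac{1}{\beta(x)},
\]
and, assuming plane-strain, area-conserving deformation, the extension implied by a thinning profile of present-day length $L$ is
\[
E_{\mathrm{thin}} = \int_0^{L} \gamma(x)\,\mathrm{d}x .
\]
The extension discrepancy is then the observation that the summed fault heaves $h_i$ restored from seismic sections fall short of this estimate, $E_{\mathrm{fault}} = \sum_i h_i < E_{\mathrm{thin}}$. In the same notation, crustal DDT corresponds to $\gamma_{\mathrm{upper\ crust}}(x) \neq \gamma_{\mathrm{lower\ crust}}(x)$, and lithospheric DDT to $\gamma_{\mathrm{crust}}(x) \neq \gamma_{\mathrm{mantle}}(x)$.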