In cases where an equivalent porous medium assumption is suitable for simulating groundwater flow in bedrock aquifers, seepage into underground mine workings (UMWs) can be estimated by specifying MODFLOW drain nodes at the contact between the water-bearing rock and the dewatered mine openings. However, this approach leads to significant numerical problems when applied to the extensive networks of UMWs that often exist at mine sites. Numerical simulations conducted for individual UMWs, such as a vertical shaft or a horizontal drift, showed that accurate prediction of seepage rates can be achieved either by applying a grid spacing that is much finer than the diameter/width of the simulated openings (explicit modeling) or by using a coarser grid with cell sizes exceeding the characteristic width of the shafts or drifts by a factor of 3. Theoretical insight into this behavior, based on the so-called well-index theory, is presented. It is demonstrated that applying this theory minimizes the numerical errors associated with MODFLOW simulation of seepage into UMWs on a relatively coarse Cartesian grid. The presented examples include simulated steady-state groundwater flow from homogeneous, heterogeneous, and/or anisotropic rock into a vertical shaft, a horizontal drift/cross-cut, a ramp, two parallel drifts, and a combined system of a vertical shaft connected to a horizontal drift.
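
The well-index correction referred to above is, in essence, Peaceman's equivalent-radius formula applied to the conductance of the drain cells. The sketch below illustrates the idea for a single model layer penetrated by a vertical shaft; it is a minimal illustration under stated assumptions, not the paper's implementation: the function names are hypothetical, the shaft is treated as a radial sink fully penetrating its host cell, and the surrounding MODFLOW setup (drain elevations, DRN package input) is not shown.

```python
import math

def peaceman_equivalent_radius(dx, dy, kx, ky):
    """Peaceman equivalent well-block radius r_eq for a rectangular cell
    (dx by dy) with horizontal hydraulic conductivities kx and ky.
    For an isotropic, square cell this reduces to r_eq ~ 0.198 * dx."""
    a = math.sqrt(ky / kx)
    b = math.sqrt(kx / ky)
    return (0.28 * math.sqrt(a * dx**2 + b * dy**2)) / (a**0.5 + b**0.5)

def shaft_drain_conductance(kx, ky, dz, dx, dy, r_shaft):
    """Conductance C for a DRN cell representing one model layer
    (thickness dz) of a dewatered vertical shaft of radius r_shaft,
    so that the drain flux Q = C * (h_cell - h_drain) reproduces
    radial flow toward the shaft wall. Valid when r_shaft is small
    relative to the equivalent radius of the host cell."""
    r_eq = peaceman_equivalent_radius(dx, dy, kx, ky)
    k_eff = math.sqrt(kx * ky)  # geometric-mean horizontal conductivity
    return 2.0 * math.pi * k_eff * dz / math.log(r_eq / r_shaft)

# Example: a shaft of 2.5 m radius in a 50 m x 50 m cell (cell width
# ten times the 5 m shaft diameter), 10 m thick layer, isotropic
# K = 1e-5 m/s (all values hypothetical).
C = shaft_drain_conductance(kx=1e-5, ky=1e-5, dz=10.0,
                            dx=50.0, dy=50.0, r_shaft=2.5)
print(f"DRN conductance: {C:.3e} m^2/s")
```

On a coarse grid such as this, the conductance computed from the well index plays the role described above: it embeds the subgrid-scale radial convergence of flow toward the opening that the Cartesian grid cannot resolve explicitly, which is what allows accurate seepage rates on cells several times wider than the opening itself.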