In recent years, the adoption of distributed energy resources (DERs) has accelerated globally, owing to their potential to decrease net demand and reduce the costs associated with transmission and distribution networks. In practice, however, many DERs are not sited in load areas but in remote areas chosen to maximize return on investment, i.e., areas characterized by high solar radiation, abundant wind resources, and relatively low land-use fees. These locational mismatches can lead to excessive network construction, significant congestion, and increased loss costs. To achieve cost-effective grid operation and planning, it is therefore crucial to site DERs with their system-level impacts in mind. Because the locational benefits of DERs are neither fully assessed nor reflected in today's deployment process, DERs are not steered toward appropriate sites. To fill this gap, this study quantifies the benefits of diverse DER deployment scenarios using Monte Carlo simulations and provides policy recommendations for utilities and authorities. To estimate these benefits, we conducted a long-term analysis using a transmission expansion planning approach and a short-term analysis based on an optimal power flow methodology. The proposed analysis reveals that the upper 10% scenario of the experimental group, with better DER locations, achieves a 27% cost reduction relative to the control group. For the same amount of DER deployment, this improvement of the well-located scenario amounts to a benefit of 1,519 M$ in the Korean power system case study.
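The abstract summarizes rather than details the simulation workflow. The following Python sketch illustrates, under purely hypothetical bus data and a toy congestion/loss cost proxy, how Monte Carlo siting scenarios for two groups (remote-biased versus load-biased DER locations) could be sampled and compared, echoing the upper-10%-versus-control comparison described above. It is not the paper's transmission expansion planning or optimal power flow model; all parameters and the cost function are illustrative assumptions.

```python
# Minimal Monte Carlo siting-comparison sketch (illustrative only; not the paper's model).
import numpy as np

rng = np.random.default_rng(0)

N_BUSES = 20
N_SCENARIOS = 1000
TOTAL_DER_MW = 500.0

# Hypothetical per-bus data: local load (MW) and electrical "distance" from load centers.
load = rng.uniform(10, 100, N_BUSES)
distance = rng.uniform(1, 10, N_BUSES)   # proxy for transmission exposure of each bus
cost_per_mw_km = 0.05                    # illustrative congestion/loss cost coefficient

def sample_scenario(weights):
    """Split TOTAL_DER_MW across buses with random shares biased by `weights`."""
    shares = rng.dirichlet(weights)
    return TOTAL_DER_MW * shares

def network_cost(der_mw):
    """Toy proxy: DER output not absorbed by local load travels `distance` and incurs cost."""
    exported = np.maximum(der_mw - load, 0.0)
    return np.sum(exported * distance * cost_per_mw_km)

# Control group: siting biased toward remote (high-distance) buses.
# Experimental group: siting biased toward load (high-demand) buses.
control = np.array([network_cost(sample_scenario(distance)) for _ in range(N_SCENARIOS)])
experimental = np.array([network_cost(sample_scenario(load)) for _ in range(N_SCENARIOS)])

# Compare the best-performing 10% of the experimental group against the control average,
# mirroring the "upper 10% scenario" comparison described in the abstract.
top10_cost = np.quantile(experimental, 0.10)
reduction = 1.0 - top10_cost / control.mean()
print(f"Cost reduction of upper-10% experimental scenario vs. control mean: {reduction:.1%}")
```

In the actual study, the per-scenario cost evaluation would be replaced by the long-term transmission expansion planning model and the short-term optimal power flow model referenced above; the sketch only shows the scenario-sampling and group-comparison structure.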