Pipe leakage is an inevitable phenomenon in water distribution networks (WDNs), leading to energy waste and economic losses. Leakage events are quickly reflected in pressure measurements, so the deployment of pressure sensors plays a key role in minimizing the leakage ratio of a WDN. Considering practical constraints, including project budgets, available sensor installation locations, and sensor fault uncertainty, this paper proposes a practical methodology for optimizing pressure sensor deployment for leak identification under these conditions. Two indices are used to evaluate leak identification ability: the detection coverage rate (DCR) and the total detection sensitivity (TDS). The selection principle is to prioritize an optimal DCR and, among deployments with identical DCR, retain the largest TDS. Leakage events are generated by model simulation, and the sensors essential for maintaining the DCR are obtained by a subtraction procedure. If the budget allows surplus sensors and some deployed sensors are assumed to fail, supplementary sensors can then be determined that best compensate for the lost leak identification ability. Finally, the typical benchmark WDN Net3 is employed to demonstrate the procedure, and the results show that the methodology is well suited to real-world projects.
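
For illustration, the DCR-first, TDS-tiebreak selection principle can be sketched as a greedy procedure over a precomputed leak-to-sensor residual matrix. The following is a minimal Python sketch, not the paper's implementation: the matrix `R`, the threshold `tau`, the `budget` parameter, and the function name are all hypothetical, and in practice the residuals would come from simulated leakage events (e.g., hydraulic simulations of Net3).

```python
import numpy as np

def greedy_sensor_selection(R, tau, budget):
    """Greedy sketch of the selection principle (illustrative only).

    R      : (n_events, n_candidates) pressure-residual matrix from
             simulated leak events; R[i, j] is the pressure change
             observed at candidate location j under leak event i
    tau    : detection threshold; event i counts as detected by
             sensor j if R[i, j] >= tau
    budget : maximum number of sensors to deploy
    """
    n_events, n_candidates = R.shape
    detects = R >= tau                      # boolean detectability matrix
    chosen = []
    covered = np.zeros(n_events, dtype=bool)

    for _ in range(budget):
        best_j, best_gain, best_tds = None, -1, -1.0
        for j in range(n_candidates):
            if j in chosen:
                continue
            new_cover = covered | detects[:, j]
            gain = int(new_cover.sum())             # coverage if j is added
            tds = float(R[detects[:, j], j].sum())  # sensitivity j contributes
            # Priority: larger coverage (DCR) wins;
            # with equal coverage, larger TDS wins.
            if gain > best_gain or (gain == best_gain and tds > best_tds):
                best_j, best_gain, best_tds = j, gain, tds
        if best_j is None:
            break
        chosen.append(best_j)
        covered |= detects[:, best_j]

    dcr = covered.sum() / n_events          # fraction of events detected
    return chosen, dcr

# Toy usage: 5 simulated leak events, 4 candidate locations
# (residuals fabricated purely for illustration).
rng = np.random.default_rng(0)
R = rng.uniform(0.0, 2.0, size=(5, 4))
sensors, dcr = greedy_sensor_selection(R, tau=0.8, budget=2)
print(f"chosen sensors: {sensors}, DCR = {dcr:.2f}")
```

The sketch also suggests how the subtraction step could work in reverse: removing a chosen sensor and re-evaluating `covered` reveals whether it is essential for maintaining the DCR.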