Radio frequency (RF) driven H− ion sources are operated at very high power levels of up to 100 kW in order to achieve the desired performance. For the experimental setup, these are demanding conditions that can limit the source reliability. Therefore, assessing the optimization potential in terms of RF power losses and the RF power transfer efficiency η to the plasma has become a focus of both experimental and numerical modeling investigations for ion sources at particle accelerators and for neutral beam heating of fusion plasmas. It has been demonstrated, for example, that in typical neutral beam injection ion source setups about half of the RF power provided by the generator is lost in the RF coil and the Faraday shield due to Joule heating or via eddy currents. In a best practice approach, it is demonstrated at the ITER RF prototype ion source how experimental evaluation, accompanied by numerical modeling of the ion source, can be used to improve η. Individual optimization measures regarding the Faraday shield, the RF coil, the discharge geometry, the RF driving frequency, and the application of ferrites are discussed, which could reduce the losses by a factor of two. The provided examples are intended as guidelines that can be applied to other setups in order to achieve, with low-risk effort, an optimized ion source design in terms of reduced losses and hence increased reliability.
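To make the quantity η concrete, the following minimal sketch computes the power transfer efficiency as the plasma-coupled fraction of the generator power, η = (P_gen − P_loss)/P_gen. The function name and the loss figures are purely illustrative assumptions, chosen only to reflect the "about half the power lost in coil and Faraday shield" situation described above; they are not measured values from any source.

```python
def transfer_efficiency(p_generator_kw: float,
                        p_coil_loss_kw: float,
                        p_shield_loss_kw: float) -> float:
    """Fraction of the generator RF power coupled into the plasma.

    eta = P_plasma / P_generator, with
    P_plasma = P_generator - (coil losses + Faraday shield losses).
    """
    p_plasma_kw = p_generator_kw - p_coil_loss_kw - p_shield_loss_kw
    return p_plasma_kw / p_generator_kw


# Hypothetical example: 100 kW from the generator, with roughly half
# lost via Joule heating in the coil and eddy currents in the shield.
eta = transfer_efficiency(100.0, 25.0, 25.0)
print(f"eta = {eta:.2f}")  # eta = 0.50
```

Halving the combined coil and shield losses in this toy model (e.g., 12.5 kW each) would raise η from 0.50 to 0.75, illustrating why the optimization measures above target exactly these loss channels.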