In the risk assessment process, the reference dose, tolerable daily intake, or acceptable daily intake (RfD, TDI, ADI) is apportioned to specific exposure sources on the basis of a source allocation factor (AF) or relative source contribution (RSC). The U.S. Environmental Protection Agency (EPA) published an exposure decision tree framework in 2000 to guide the determination of the AF (or RSC) of drinking-water contaminants (DWC). Beyond that framework, however, there has been no systematic analysis of the basis for the use of AF in DWC risk assessments. This article therefore critically reviews and integrates current knowledge and approaches for the development of AF, focusing on its consistent use in DWC risk assessments based on consideration of (i) the risk assessment endpoint, (ii) existing guidelines, (iii) exposure estimates, (iv) usage-pattern and environmental-fate information, (v) physicochemical properties, (vi) bounds on AF, (vii) multiroute exposures, and (viii) target-population characteristics. Accordingly, for a DWC for which drinking water is not a major source of exposure and for which there is documented evidence of widespread presence in one or more of the other media (i.e., air, food, soil, or consumer products), an AF value of 0.2 is suggested. For a DWC for which drinking water represents virtually the sole major source of exposure, a ceiling AF value of 0.8 is suggested. For other situations, chemical- and context-specific AF values can be developed from exposure data or models; these should in turn be bounded by the floor and ceiling AF values originally described by the U.S. EPA (i.e., 0.2-0.8). Future studies should focus on improving methods for deriving AF by grounding them in considerations of bioavailability, target tissue dose, and the extent of route-specific absorption, as well as on improved modeling of doses received via direct/voluntary exposure through consumer products and at workplaces.
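The allocation logic summarized above (default 0.2, ceiling 0.8, and bounded chemical-specific values in between) can be sketched as follows. This is only an illustration of the decision rule as described in this summary, not an implementation of the full EPA exposure decision tree; the function name, inputs, and the conservative fallback are assumptions.

```python
def allocation_factor(water_is_major_source, widespread_in_other_media,
                      chemical_specific_af=None):
    """Sketch of the bounded AF selection rule (hypothetical helper).

    The actual EPA decision tree includes additional branches (e.g.,
    data-adequacy checks) that are not modeled here.
    """
    FLOOR, CEILING = 0.2, 0.8  # U.S. EPA floor and ceiling AF values

    # Drinking water is not a major source and the chemical is
    # documented as widespread in other media: use the 0.2 default.
    if not water_is_major_source and widespread_in_other_media:
        return FLOOR

    # Drinking water is virtually the sole major source: ceiling of 0.8.
    if water_is_major_source and not widespread_in_other_media:
        return CEILING

    # Otherwise, use a chemical-/context-specific AF from exposure data
    # or models, bounded by the floor and ceiling values.
    if chemical_specific_af is not None:
        return min(max(chemical_specific_af, FLOOR), CEILING)

    # Absent exposure data, fall back to the conservative default
    # (an assumption of this sketch, not stated in the article).
    return FLOOR
```

The clamping step in the third branch reflects the article's point that modeled AF values should not escape the 0.2-0.8 bounds originally described by the U.S. EPA.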