The bioavailability of iron from any source (e.g., iron supplement, food, or meal composite) is considered to be that portion of the total iron which is metabolizable. Philosophically, this concept is important because the amount of iron utilized by avian and mammalian species is directly associated with iron need. When assaying iron bioavailability, it is therefore necessary to use an organism whose need will exceed the amount provided. In animal assays of iron bioavailability, iron need is assured by a growth phase and/or by creation of iron deficiency through feeding an iron-deficient diet and phlebotomy. Because healthy subjects are usually used in human assays of iron bioavailability (Cook et al., 1981; Cook and Monson, 1976; Radhakrishman and Sivaprasad, 1980), it is inappropriate to compare the data obtained from animal and human assays. In fact, it is questionable whether assays of iron bioavailability yield good information on the quantities of metabolizable iron available when healthy human subjects are used.

The Committee on Dietary Allowances, Food and Nutrition Board, National Academy of Sciences (RDA, 1980) has estimated the amount of metabolizable iron (as absorbable iron) from meals consumed by human beings as ranging from 3 to 23 percent, depending on the nature of the meal. For adult women of childbearing age, the committee has assumed that 1.5 mg iron is lost daily and that 18 mg should be consumed to meet this need; they have therefore assumed that approximately 8.3 percent of the dietary iron will be metabolized. For adult men and women over the age of 51 years, the committee estimates that 1.0 mg iron will be lost daily and recommends that 10 mg be consumed to meet this need, on the assumption that only approximately 10 percent of the dietary iron will be metabolized by these people (the arithmetic behind both fractions is restated below). It should be noted, however, that what is metabolized from a food under such conditions does not necessarily reflect what is potentially metabolizable. Indeed, the majority of women of childbearing age consume less than the recommended 18 mg of iron.
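
As a restatement of the committee's implied absorption fractions (simply the ratio of assumed daily loss to recommended daily intake, not an independent estimate): for women of childbearing age, 1.5 mg / 18 mg ≈ 0.083, or about 8.3 percent; for adults over 51 years, 1.0 mg / 10 mg = 0.10, or 10 percent.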