In the present study, niobium additions of 1.79% and 3.98% were made to a 15% Cr–3% C white iron, and their effects on the microstructure, hardness and abrasive wear were analyzed. The experimental irons were melted in an open induction furnace and cast into sand molds to obtain bars of 45 mm diameter. The alloys were characterized by optical and electron microscopy and X-ray diffraction. Bulk hardness was measured in the as-cast condition and after a destabilization heat treatment at 900 °C for 30 min. Abrasive wear resistance was tested for the different irons according to the ASTM G65 standard, in both the as-cast and heat-treated conditions, under three loads (58, 75 and 93 N). The results show that the niobium additions caused a decrease in the carbon content of the alloy, and that some carbon was also consumed in forming niobium carbides at the beginning of solidification, thereby decreasing the eutectic M7C3 carbide volume fraction (CVF) from 30% for the base iron to 24% for the iron with 3.98% Nb. The overall carbide content, however, remained constant at 30%; bulk hardness varied from 48 to 55 Rockwell C (HRC), and the wear resistance showed a load-dependent behavior. At the lowest load, the wear resistance of the base iron was 50% lower than that of the 3.98% Nb iron, which is attributed to the presence of hard NbC. At the highest load, however, the wear behavior was quite similar for all the irons, which is attributed to severe carbide cracking, particularly in the as-cast alloys. After the destabilization heat treatment, the wear resistance was higher for the 3.98% Nb iron at every load, although at the highest load the difference was small. This behavior is discussed in terms of the CVF, the amount of niobium carbides, the amounts of martensite and austenite in the matrix, and the amount of secondary carbides precipitated during the destabilization heat treatment.
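
As context for how ASTM G65 abrasion results such as these are usually reduced, the sketch below converts a measured mass loss to volume loss (the quantity the standard reports, so alloys of different density compare on an equal basis) and forms a relative wear-resistance ratio of the kind quoted above. The density and mass-loss figures are illustrative placeholders, not data from this study.

```python
# Illustrative reduction of ASTM G65 abrasion data.
# All numerical inputs below are hypothetical, not values from this study.

def volume_loss_mm3(mass_loss_g: float, density_g_cm3: float) -> float:
    """Convert mass loss (g) to volume loss (mm^3): V = m / rho * 1000."""
    return mass_loss_g / density_g_cm3 * 1000.0

def relative_wear_resistance(vol_loss_ref: float, vol_loss_test: float) -> float:
    """Wear resistance taken as the reciprocal of volume loss, normalized to a reference alloy."""
    return vol_loss_ref / vol_loss_test

# Hypothetical comparison: base iron vs. 3.98% Nb iron at the lowest load (58 N).
base   = volume_loss_mm3(mass_loss_g=0.30, density_g_cm3=7.6)  # placeholder values
nb_398 = volume_loss_mm3(mass_loss_g=0.20, density_g_cm3=7.6)  # placeholder values

print(f"Base iron volume loss:  {base:.1f} mm^3")
print(f"3.98% Nb volume loss:   {nb_398:.1f} mm^3")
print(f"Relative wear resistance (Nb vs. base): {relative_wear_resistance(base, nb_398):.2f}")
```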