Laboratory measurements of ultrafine titanium dioxide (TiO2) particulate matter loaded on filters were made using three field-portable methods (X-ray fluorescence (XRF), laser-induced breakdown spectroscopy (LIBS), and Fourier-transform infrared (FTIR) spectroscopy) to assess their potential for determining end-of-shift exposure. Ultrafine TiO2 particles were aerosolized and collected onto 37 mm polycarbonate track-etched (PCTE) filters at loadings ranging from 3 to 578 μg titanium (Ti) per filter. The limit of detection (LOD), limit of quantification (LOQ), and calibration fit were determined for each measurement method. The LODs were 11.8, 0.032, and 108 μg Ti per filter for XRF, LIBS, and FTIR, respectively, and the LOQs were 39.2, 0.11, and 361 μg Ti per filter, respectively. The XRF calibration curve was linear over the widest dynamic range, up to the maximum loading tested (578 μg Ti per filter). LIBS was more sensitive, but because of the sample preparation method the highest filter loading that could be measured was 252 μg Ti per filter. XRF and LIBS showed good predictability, as measured by regressing the predicted mass against the gravimetric mass on the filter: XRF and LIBS overestimated mass by 4% and 2%, respectively, with coefficients of determination (R²) of 0.995 and 0.998. FTIR measurements were less dependable because of interference from the PCTE filter media; FTIR overestimated mass by 2% with an R² of 0.831.
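For readers unfamiliar with how figures of merit like these are typically derived, the short sketch below illustrates one common approach: fit a linear calibration, estimate LOD and LOQ from the 3σ/slope and 10σ/slope convention, then regress back-calculated (predicted) mass against gravimetric mass to obtain the bias and R². The convention, the example data, and all variable names are assumptions for illustration only; they are not the authors' procedure or results.

```python
# Illustrative sketch (not the authors' code): one common way to derive
# LOD/LOQ and calibration statistics of the kind reported above.
# Uses the 3*sigma/slope and 10*sigma/slope convention and made-up data.
import numpy as np

# Hypothetical calibration data: gravimetric Ti mass on filter (ug) vs.
# instrument response (arbitrary units).
mass_ug = np.array([3.0, 25.0, 60.0, 120.0, 250.0, 400.0, 578.0])
response = np.array([0.9, 7.8, 18.6, 37.1, 77.5, 123.9, 179.2])

# Linear calibration fit: response = slope * mass + intercept.
slope, intercept = np.polyfit(mass_ug, response, 1)
predicted_response = slope * mass_ug + intercept

# Standard deviation of the regression residuals (blank replicates are
# often used instead; residual sigma is a common substitute).
residual_sigma = np.std(response - predicted_response, ddof=2)

# Conventional detection/quantification limits expressed in mass units.
lod_ug = 3.0 * residual_sigma / slope
loq_ug = 10.0 * residual_sigma / slope

# Back-calculate predicted mass from the calibration, then regress it
# against gravimetric mass to judge predictability (bias and R^2).
predicted_mass = (response - intercept) / slope
bias_slope, _ = np.polyfit(mass_ug, predicted_mass, 1)
r_squared = np.corrcoef(mass_ug, predicted_mass)[0, 1] ** 2

print(f"LOD ~ {lod_ug:.1f} ug Ti, LOQ ~ {loq_ug:.1f} ug Ti")
print(f"bias ~ {100 * (bias_slope - 1):+.1f}%, R^2 = {r_squared:.3f}")
```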