Use of X-ray fluorescence (XRF) spectrometry for estimation of soil texture, pH, and cation exchange capacity (CEC) is desirable given the time-consuming nature of traditional methods. Recent studies have shown promising results; however, further investigation is required to determine how sample preparation and data evaluation techniques affect accuracy. Our objective was to compare (I) a simple but well-founded approach, combining measurement of powder samples with modeling that uses elemental contents as predictors in stepwise multiple linear regression (MLR), against alternative approaches including (II) use of partial least squares regression (PLSR), (III) sample preparation as a pressed pellet, and (IV) use of spectral intensities as predictors (20 kV, 40 kV, and concatenated 20 + 40 kV). A total of 395 loess soils from three arable fields were used, with both a fivefold random training-testing approach and a leave-one-site-out training-testing approach. With random partitioning, accuracy for clay, silt, and sand with approach I was excellent (ratio of performance to interquartile distance in validation, RPIQv = 8.5–12.9), while pH and CEC estimations were satisfactory to excellent (RPIQv = 2.0–2.5 and 2.2–3.3, respectively). Differences between MLR and PLSR were negligible. Pellet samples increased accuracy by 1%–13% of RPIQv when 20 kV intensities served as predictors, but effects were inconsistent for the other predictors. The optimal predictor varied by property, and differences ranged from 3% to 13% of RPIQv. Improvements from approach I to the best alternative were largest for texture (10%–15%) but may be superfluous given the excellent accuracy across all approaches. Although leave-one-site-out training resulted in variable performance, inclusion of soils from the target site in training ensured reliable accuracy.
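
Because the accuracy claims above are anchored to RPIQv, a brief definition may help. The form below is the standard chemometric definition of the metric, not a formula quoted from this study:

```latex
\mathrm{RPIQ}_v = \frac{Q_3 - Q_1}{\mathrm{RMSE}_v}
```

where \(Q_1\) and \(Q_3\) are the first and third quartiles of the observed values in the validation set and \(\mathrm{RMSE}_v\) is the root mean squared error of the validation predictions; larger values indicate prediction error that is small relative to the spread of the data.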
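As a concrete illustration of the two partitioning schemes, the sketch below pairs PLSR (approach II) with fivefold random partitioning and with leave-one-site-out partitioning. The feature matrix `X` (elemental contents or spectral intensities), target `y` (e.g., clay content), `site_labels`, the choice of scikit-learn, and the `n_components` setting are all illustrative assumptions, not details taken from the study (stepwise MLR, in particular, would require an additional predictor-selection wrapper not shown here).

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import KFold, LeaveOneGroupOut

def rpiq(y_true, y_pred):
    """Ratio of performance to interquartile distance: IQR / RMSE."""
    q1, q3 = np.percentile(y_true, [25, 75])
    rmse = np.sqrt(np.mean((np.asarray(y_true) - np.asarray(y_pred)) ** 2))
    return (q3 - q1) / rmse

def validate(X, y, splitter, groups=None, n_components=10):
    """Return RPIQ_v for each train/test split produced by `splitter`."""
    scores = []
    for train, test in splitter.split(X, y, groups):
        model = PLSRegression(n_components=n_components)  # n_components is an assumption
        model.fit(X[train], y[train])
        y_hat = model.predict(X[test]).ravel()
        scores.append(rpiq(y[test], y_hat))
    return scores

# Random fivefold partitioning (soils from all three fields mixed):
# scores_random = validate(X, y, KFold(n_splits=5, shuffle=True, random_state=0))
# Leave-one-site-out partitioning (each field held out in turn):
# scores_loso = validate(X, y, LeaveOneGroupOut(), groups=site_labels)
```

Under this setup, the abstract's closing observation corresponds to `scores_loso` varying more across held-out fields than `scores_random` does across random folds, since the random folds include soils from the target site in training.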