Meeting the ever-increasing global demand for food, feed, and fiber while conserving the quantity and quality of limited agricultural water resources and maintaining the sustainability of irrigated agriculture requires optimizing irrigation management with advanced technologies such as soil moisture sensors. In this study, the accuracy of five soil moisture sensors was evaluated in two irrigated cropping systems, one each in central and southwest Oklahoma, with differing levels of soil salinity and clay content. With factory calibrations, three of the sensors achieved sufficient accuracy at the site with lower salinity and clay content, while none performed satisfactorily at the site with higher salinity and clay content. The study also compared different approaches (laboratory, sensor-based, and the Rosetta model) for determining the soil moisture thresholds required for irrigation scheduling, i.e., field capacity (FC) and wilting point (WP). The FC and WP estimated by the Rosetta model were closest to the laboratory measurements on undisturbed soil cores, regardless of the type and number of input parameters supplied to the model. The sensor-based method of ranking the readings overestimated both FC and WP. Finally, soil moisture depletion, a critical parameter in effective irrigation scheduling, was calculated by combining sensor readings with FC estimates. Ranking-based FC led to overestimation of soil moisture depletion, even for accurate sensors at the site with lower salinity and clay content.
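For reference, a common way to express soil moisture depletion from sensor readings and FC is given below; this is a standard formulation (root-zone depletion in the style of FAO-56), with illustrative symbols that are not necessarily those used in this study: $\theta$ is the sensor-measured volumetric water content (m$^3$ m$^{-3}$), $\theta_{FC}$ and $\theta_{WP}$ are the volumetric water contents at FC and WP, and $Z_r$ is the root-zone depth (m).

\[
D_r = 1000\,(\theta_{FC} - \theta)\,Z_r \ \ \text{(mm)}, \qquad
f_{dep} = \frac{\theta_{FC} - \theta}{\theta_{FC} - \theta_{WP}}
\]

Under this formulation, any overestimation of $\theta_{FC}$ inflates the difference $(\theta_{FC} - \theta)$ and therefore the computed depletion, which is consistent with the overestimated depletion obtained from ranking-based FC even when the sensor readings themselves were accurate.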