There are pressing concerns about the interplay between agricultural productivity, water demand, and water availability in semi-arid to arid regions of the world. Irrigated agriculture is currently the dominant water user in these regions and is estimated to consume approximately 80% of the world's diverted freshwater resources. We develop an improved irrigated land-use mapping algorithm that uses the seasonal maximum value of a spectral index to distinguish between irrigated and non-irrigated parcels in Idaho's Snake River Plain. We compare this approach to two alternative algorithms that differentiate between irrigated and non-irrigated parcels using either spectral index values at a single date or the area beneath spectral index trajectories over the agricultural growing season. Using six pixel- and county-scale error metrics, we evaluate the performance of these three algorithms across all combinations of two growing seasons (2002 and 2007), two datasets (MODIS and Landsat 5), and three spectral indices: the Normalized Difference Vegetation Index (NDVI), Enhanced Vegetation Index (EVI), and Normalized Difference Moisture Index (NDMI). We demonstrate that the seasonal-maximum algorithm improves classification accuracy over the accepted single-date approach, yielding on average a 60% reduction in county-scale root mean square error (RMSE) and modest gains in overall accuracy in the pixel-scale validation. The greater accuracy of the seasonal-maximum algorithm is primarily due to its ability to correctly classify non-irrigated lands in riparian and developed areas of the study region.
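The core of the seasonal-maximum approach can be sketched in a few lines: take the per-pixel maximum of a spectral index (e.g. NDVI) across all image dates in the growing season, then threshold it. This is a minimal illustration only; the function name, array layout, and the cutoff value of 0.7 are assumptions for the example, not the calibrated threshold used in the study.

```python
import numpy as np

def seasonal_max_classify(index_stack, threshold=0.7):
    """Classify pixels as irrigated from the seasonal maximum of a
    spectral index.

    index_stack : array of shape (n_dates, rows, cols) holding the
        spectral index (e.g. NDVI) for each image date in the season.
    threshold : illustrative cutoff; a real application would
        calibrate this against ground reference data.
    Returns a boolean mask, True where the seasonal maximum exceeds
    the threshold (classified as irrigated).
    """
    # Per-pixel maximum over the time axis, ignoring masked/cloudy dates
    seasonal_max = np.nanmax(index_stack, axis=0)
    return seasonal_max > threshold

# Toy example: three dates over a 2x2 scene
stack = np.array([
    [[0.20, 0.80], [0.40, 0.30]],
    [[0.30, 0.90], [0.50, 0.20]],
    [[0.25, 0.85], [0.75, 0.10]],
])
mask = seasonal_max_classify(stack)
```

In this toy scene, only the pixels whose index peaks above 0.7 at some point in the season are flagged as irrigated, which is how the approach separates irrigated fields (high peak greenness) from riparian or developed land whose index never reaches crop-like maxima.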