Crop phenology models that use constant temperature parameters across developmental stages may be less accurate and may exhibit temperature-dependent systematic prediction error (bias). Using the DD10 model, we evaluated default and optimized (DD_Opt) temperature parameters with data from seven California rice (Oryza sativa L.) cultivars grown in six locations over 3 yr (2012–2014). Furthermore, we evaluated the effect of stage-dependent temperature parameters on model performance using two- and three-stage optimization approaches. The optimized temperature parameters, DD_Opt (RMSE: 2.3–5.4 d), performed better than the default DD10 (RMSE: 2.9–7.3 d). A temperature sensitivity analysis indicated that the time from planting to panicle initiation was most sensitive to temperature (every 1°C increase decreased days to panicle initiation by 1.8 d), whereas the time from heading to R7 (marked by the appearance of one yellow hull on the main stem panicle) was not affected by temperature. Optimized temperature parameters varied between stages, with base temperature decreasing and optimum temperature increasing with plant development. Compared with DD_Opt, two-stage optimization (planting–heading and heading–R7) reduced the RMSE by 0.8 d and the systematic error by 0.6 d °C⁻¹. Three-stage optimization (planting–panicle initiation, panicle initiation–heading, and heading–R7) further reduced the RMSE by 1.1 d and the systematic error by 1.4 d °C⁻¹ for the preheading period. These results demonstrate the importance of using stage-dependent parameters to improve the accuracy of phenological models, which may be important when models are used to study crop responses to climate change, field management options, ecosystem productivity, breeding, and yield gap analysis.
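
The abstract does not reproduce the DD10 equations, but the contrast between constant and stage-dependent temperature parameters can be illustrated with a minimal degree-day sketch. The code below assumes a simple daily mean-temperature accumulation bounded by base (t_base) and optimum (t_opt) cardinal temperatures; the function names, parameter values, and thermal-time requirements are hypothetical illustrations, not the study's calibrated parameters.

```python
def daily_degree_days(tmean: float, t_base: float, t_opt: float) -> float:
    """Degree days accrued on one day from mean air temperature (°C).

    Assumed form: no accumulation below t_base, linear accumulation up to
    t_opt, and capped at (t_opt - t_base) on warmer days.
    """
    return max(0.0, min(tmean, t_opt) - t_base)


def days_to_stage(tmean_series, dd_requirement, t_base, t_opt):
    """Days needed to accumulate dd_requirement degree days, or None."""
    cumulative = 0.0
    for day, tmean in enumerate(tmean_series, start=1):
        cumulative += daily_degree_days(tmean, t_base, t_opt)
        if cumulative >= dd_requirement:
            return day
    return None  # requirement not reached within the temperature record


# Hypothetical stage-dependent cardinal temperatures and thermal-time
# requirements (NOT the calibrated values from the study); they only follow
# the abstract's qualitative pattern of base temperature decreasing and
# optimum temperature increasing with development.
STAGE_PARAMS = {
    "planting_to_PI": {"t_base": 11.0, "t_opt": 28.0, "dd_req": 600.0},
    "PI_to_heading":  {"t_base": 10.0, "t_opt": 30.0, "dd_req": 400.0},
    "heading_to_R7":  {"t_base":  8.0, "t_opt": 32.0, "dd_req": 450.0},
}


def predict_phenology(tmean_series, stage_params):
    """Cumulative days after planting at which each stage is reached,
    advancing stage by stage with stage-specific parameters."""
    timeline, start = {}, 0
    for stage, p in stage_params.items():
        d = days_to_stage(tmean_series[start:], p["dd_req"],
                          p["t_base"], p["t_opt"])
        if d is None:
            break
        start += d
        timeline[stage] = start
    return timeline
```

Running the same routine with a single parameter set applied to all stages, versus the stage-specific sets above, is the kind of contrast evaluated by the two- and three-stage optimizations reported here.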