[1] Many field experiments have observed significant temporal variations in thermal infrared (TIR) emission directionality, so this phenomenon must be explained quantitatively before the potential applications of directional remotely sensed TIR observations can be exploited. The main objective of this paper is to determine when and how the significant directional effect appears. Two models, TRGM and Cupid, are linked to simulate the temporal variations of the directional brightness temperature T_B(θ) of crop canopies, including winter wheat and summer corn. Two indicators are defined: (1) ΔT_B,AVG, the mean difference between the nadir T_B(0°) and the off-nadir T_B(55°), and (2) ΔT_B,STD, the standard deviation of T_B(55°) over different view azimuth angles. Simulation results show that the highest ΔT_B,AVG, up to 4°C, appears mostly at midday (1200-1300 LT), while the lowest ΔT_B,AVG appears mostly in the early morning (0700-0800 LT) or late afternoon (1700-1800 LT). ΔT_B,STD is about one third of ΔT_B,AVG and should not be neglected, given its considerable value at around 1400 LT. This trend is confirmed by field measurements at both the wheat and corn sites. The major factors affecting the trend are also identified through sensitivity analysis. Among them, soil water content, LAI, and solar radiation are the three most significant, whereas wind speed and air temperature have a larger effect on ΔT_B,AVG than air humidity. Interestingly, ΔT_B,AVG reaches a maximum when the LAI is around 0.8. Further analysis shows that ΔT_B,AVG is related to soil surface net radiation, which will be useful for net radiation estimation.
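To make the two indicators concrete, the following is a minimal sketch of how they could be computed from a nadir observation and a set of off-nadir observations at 55° sampled over several view azimuth angles. The function name, the sign convention (nadir minus off-nadir), and the example values are assumptions for illustration only; the abstract defines the indicators verbally and does not prescribe an implementation.

```python
import numpy as np

def directional_indicators(tb_nadir, tb_offnadir_azimuths):
    """Hypothetical helper: compute the two directional indicators.

    tb_nadir             : nadir brightness temperature T_B(0 deg), scalar (degC)
    tb_offnadir_azimuths : T_B(55 deg) sampled at several view azimuth angles (degC)
    """
    tb55 = np.asarray(tb_offnadir_azimuths, dtype=float)
    # Delta T_B,AVG: mean difference between nadir and off-nadir brightness temperature
    dtb_avg = np.mean(tb_nadir - tb55)
    # Delta T_B,STD: spread of T_B(55 deg) across the view azimuth angles
    dtb_std = np.std(tb55)
    return dtb_avg, dtb_std

# Example with made-up midday values (degC) at four view azimuths
avg, std = directional_indicators(32.0, [28.5, 29.0, 27.8, 28.3])
print(f"dTB_AVG = {avg:.2f} degC, dTB_STD = {std:.2f} degC")
```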