Radiation fog nowcasting remains a complex yet critical task due to its substantial impact on traffic safety and economic activity. Current numerical weather prediction models are hindered by computational intensity and knowledge gaps regarding fog‐influencing processes. Machine‐learning (ML) models, particularly those employing the eXtreme Gradient Boosting (XGB) algorithm, may offer a robust alternative, given their ability to learn directly from data, swiftly generate nowcasts, and capture non‐linear interrelationships among fog variables. However, unlike recurrent neural networks, XGB does not inherently process temporal data, which is crucial for modelling fog formation and dissipation. This study proposes incorporating preprocessed temporal data into model training and applying a weighted moving‐average filter to damp the substantial fluctuations typical of fog development. Using an ML training and evaluation scheme for time‐series data, we conducted an extensive bootstrapped comparison of how different smoothing intensities and trend‐information timespans affect model performance at three levels: overall performance, fog formation, and fog dissipation. Performance was evaluated against one benchmark and two baseline models. A significant improvement was observed for the station in Linden‐Leihgestern (Germany), where the initial F1 score of 0.75 (prior to smoothing and trend‐information incorporation) rose to 0.82 after applying the smoothing technique and further to 0.88 when trend information was incorporated. The forecasting periods ranged from 60 to 240 min into the future. This study offers novel insights into the interplay of data smoothing, temporal preprocessing, and ML in advancing radiation fog nowcasting.
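To make the smoothing step concrete, the following is a minimal, illustrative sketch (not the authors' exact implementation) of a causal weighted moving‐average filter of the kind described above: each smoothed value is a weighted mean of the current and preceding observations, damping short‐term fluctuations in a visibility series before it is passed to an XGB model. The window length, weights, and the example visibility values are all assumptions for illustration.

```python
def weighted_moving_average(series, weights):
    """Smooth `series` with a causal weighted moving average.

    `weights` are ordered oldest to newest, so the last weight applies to
    the current observation. Shorter windows are used near the start of
    the series, so no future data is required (suitable for nowcasting).
    """
    smoothed = []
    for i in range(len(series)):
        # Take up to len(weights) most recent samples, including index i.
        window = series[max(0, i - len(weights) + 1): i + 1]
        w = weights[-len(window):]
        smoothed.append(sum(x * wi for x, wi in zip(window, w)) / sum(w))
    return smoothed

# Hypothetical 10-min visibility observations in metres (illustrative only).
visibility = [900, 850, 400, 950, 300, 280, 260, 240]
print(weighted_moving_average(visibility, weights=[1, 2, 3]))
```

Heavier weighting of recent samples (here 3 vs. 1) preserves responsiveness to genuine visibility drops while suppressing single‐step spikes; the paper's comparison of "smoothing intensities" corresponds to varying such window lengths and weightings.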