This study uses a version of the Advanced Regional Prediction System (ARPS) with a canopy submodel (ARPS‐CANOPY) to evaluate the sensitivity of cold‐air pool evolution to forest canopy density and valley geometry and to elucidate the underlying processes. Numerical experiments are conducted with forest canopy structures spanning from bare ground to dense canopies and with terrain configurations ranging from small, shallow valleys to broad, deep valleys. In a set of experiments in which forest canopy density is varied, the minimum potential temperature in the cold‐air pool is found to be as much as 15 K warmer with a dense canopy than with no canopy. Analysis of the thermodynamic budget reveals weaker cooling rates in the dense canopy case than in the sparse canopy case, with the differences in cooling rates persisting throughout the simulation. An additional experiment, in which cooling of canopy elements and the influence of the canopy on the ground radiation budget are neglected, highlights the important role the canopy plays in limiting radiative loss from the ground surface along the sidewall and valley floor. In experiments in which valley geometry is varied, the cold‐air pool is found to be strongest in a medium valley (10 km wide, 187.5 m deep) and weakest in a small valley (2 km wide, 37.5 m deep). Analysis of along‐slope buoyancy suggests that downslope flow‐driven cooling is more efficient in a medium valley than in a large valley (30 km wide, 500 m deep).
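For context, a minimal sketch of the two diagnostics referred to above is given below. These are standard illustrative forms, not necessarily the exact budget decomposition or reference state used in the study; the partitioning of the potential‐temperature tendency and the reference profile \(\theta_{\mathrm{ref}}(z)\) are assumptions here.

\[
\frac{\partial \theta}{\partial t}
= -\,\mathbf{u}\cdot\nabla\theta
\;+\;\left(\frac{\partial \theta}{\partial t}\right)_{\mathrm{rad}}
\;+\;\left(\frac{\partial \theta}{\partial t}\right)_{\mathrm{turb}},
\qquad
b_{s} = g\,\frac{\theta - \theta_{\mathrm{ref}}(z)}{\theta_{\mathrm{ref}}(z)}\,\sin\alpha .
\]

Here the local potential‐temperature tendency is partitioned into advective, radiative, and turbulent contributions, and \(b_{s}\) is the along‐slope component of buoyancy for a sidewall with local slope angle \(\alpha\), evaluated relative to the assumed reference profile \(\theta_{\mathrm{ref}}(z)\), with \(g\) the gravitational acceleration. Under this convention, more strongly negative \(b_{s}\) along the sidewall implies a stronger downslope flow and hence more efficient delivery of cold air to the valley floor.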