Energisation of large power transformers may cause significant voltage dips, the severity of which depends largely on a number of parameters, including circuit breaker closing time, transformer core residual flux, core saturation characteristic, and network conditions. Since most of these parameters are stochastic in nature, a Monte Carlo simulation was conducted in this study to stochastically assess the voltage dips caused by transformer energisation in a 400 kV grid, using a network model developed and validated against field measurements. A dip frequency pattern was identified over 1000 stochastic runs; it was found to be sensitive to the residual flux distribution but insensitive to the closing offset time distribution. The probability of reaching the worst-case dip magnitude (estimated under the commonly agreed worst energisation condition) was found to be lower than 0.5%, and about 80% of the dips are likely to have magnitudes below 0.6 pu of the worst case. Nevertheless, some dips exceed the worst-case dip magnitude, indicating the inadequacy of a deterministic assessment approach based on the commonly agreed worst energisation condition.
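To make the stochastic workflow concrete, the following is a minimal Python sketch of the Monte Carlo loop described above. The sampling distributions, the dip_magnitude surrogate, and the worst_case threshold are all illustrative assumptions: the actual study evaluates each sampled (closing time, residual flux) pair with a validated electromagnetic-transient model of the 400 kV network, which cannot be reproduced here.

```python
import numpy as np

rng = np.random.default_rng(seed=42)

N_RUNS = 1000            # number of stochastic energisations, as in the study
F_NOMINAL = 50.0         # Hz, 400 kV grid frequency
CYCLE = 1.0 / F_NOMINAL  # one power-frequency cycle, in seconds

# Stochastic inputs (distributions are illustrative, not the paper's):
# - closing offset time: uniform across one point-on-wave cycle
# - residual flux: uniform within +/-0.8 pu on the energised phase
closing_offset = rng.uniform(0.0, CYCLE, N_RUNS)
residual_flux = rng.uniform(-0.8, 0.8, N_RUNS)

def dip_magnitude(t_close, phi_res):
    """Placeholder surrogate for the validated EMT network model.

    Returns a voltage-dip magnitude in pu. A real study would run an
    electromagnetic-transient simulation for each sample; this toy
    expression only mimics the qualitative trend that dips deepen when
    energisation occurs near a voltage zero-crossing with a large
    opposing residual flux.
    """
    point_on_wave = np.sin(2.0 * np.pi * F_NOMINAL * t_close)
    return 0.1 * np.abs(phi_res) * (1.0 - np.abs(point_on_wave)) + 0.01

dips = dip_magnitude(closing_offset, residual_flux)

# Dip frequency pattern: histogram of dip magnitudes over all runs
counts, edges = np.histogram(dips, bins=10)
for lo, hi, n in zip(edges[:-1], edges[1:], counts):
    print(f"{lo:.3f}-{hi:.3f} pu: {n} runs")

# Probability of reaching the deterministic worst-case dip magnitude
worst_case = 0.09  # pu, illustrative threshold only
print(f"P(dip >= worst case) = {np.mean(dips >= worst_case):.3%}")
```

Repeating the histogram under alternative input distributions is what reveals the sensitivity reported above: reshaping the residual flux distribution shifts the dip frequency pattern, whereas reshaping the closing offset time distribution leaves it largely unchanged.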