This paper presents a mathematical approach to model the effects of random phenomena, such as turbulence and fire-spotting, within existing wildfire simulators. The formulation proposes that the propagation of the
fire-front is the sum of a drifting component (obtained from an existing
wildfire simulator without turbulence and fire-spotting) and a random
fluctuating component. The modelling of the random effects is embodied in a probability density function that accounts for the fluctuations around the fire perimeter given by the drifting component. In the past, this formulation
has been applied to include these random effects in a wildfire simulator
based on an Eulerian moving interface method, namely the Level Set Method
(LSM), but in this paper the same formulation is adapted for a wildfire
simulator based on a Lagrangian front tracking technique, namely the Discrete
Event System Specification (DEVS). The main highlight of the present study is
the comparison of the performance of a Lagrangian and an Eulerian moving
interface method when applied to wild-land fire propagation. Simple idealised
numerical experiments are used to investigate the potential applicability of
the proposed formulation to DEVS and to compare its behaviour with that of
the LSM. The results show that the DEVS-based wildfire propagation model qualitatively improves its performance (e.g., reproducing flank and back fires, increased fire spread due to pre-heating of the fuel by hot air and firebrands, fire propagation across no-fuel zones, secondary fire generation,
\dots). Although the results presented here lack a validation exercise and provide only a proof of concept, they indicate a clear potential for operational use. Existing LSM- and DEVS-based operational simulators, such as WRF-SFIRE and ForeFire respectively, can serve as an ideal basis for this purpose.

Comment: Accepted for publication in Commun. Nonlinear Sci. Numer. Simulat.
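As an illustrative sketch of the formulation summarised above (the symbols $\phi$, $\phi_e$, $f$ and the threshold $\phi_{th}$ are our own notation, not necessarily that of the paper), the effective burnt-area indicator can be written as the sharp indicator produced by the drifting front smoothed by the probability density function of the random fluctuations:
\[
\phi_e(\mathbf{x},t) \;=\; \int_{\mathcal{S}} \phi(\bar{\mathbf{x}},t)\, f(\mathbf{x}; t \mid \bar{\mathbf{x}})\, \mathrm{d}\bar{\mathbf{x}},
\qquad
\text{with } \mathbf{x} \text{ marked as burnt when } \phi_e(\mathbf{x},t) \ge \phi_{th},
\]
where $\phi(\bar{\mathbf{x}},t)$ is the indicator function of the region enclosed by the drifting front (provided by the LSM- or DEVS-based simulator without turbulence and fire-spotting), $f(\mathbf{x}; t \mid \bar{\mathbf{x}})$ is the probability density function of the displacement of a burning point $\bar{\mathbf{x}}$ due to turbulence and firebrand landing, and $\mathcal{S}$ is the simulation domain.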