Nonsmooth dynamics driven by stochastic disturbances arise in a wide variety of engineering problems. Impulsive interventions are often employed to control stochastic systems; however, their modeling and analysis under execution delay have been less explored. In addition, continuously observing the dynamics is not always possible. In this article, motivated by an environmental restoration problem, we formulate and analyze a continuous-time stochastic impulse control problem subject to execution delay under discrete and random observations. The dynamics have a nonsmooth coefficient modulated by a Markov chain and, owing to this nonsmoothness, eventually attain an undesirable state such as depletion. The goal of the control problem is to find the most cost-efficient policy that prevents the dynamics from reaching this undesirable state. We demonstrate that finding the optimal policy reduces to solving a nonstandard system of degenerate elliptic equations, the Hamilton-Jacobi-Bellman equation (HJBE), which is rigorously and analytically verified in a simplified case. The associated Fokker-Planck equation (FPE) for the controlled dynamics is also derived and solved explicitly. The model is finally applied to the numerical computation of a recent river environmental restoration problem. The HJBE and FPE are successfully computed, and the optimal policy and the probability density functions are obtained numerically. The impacts of execution delay are discussed to analyze the model in greater depth.