As the gate length of CMOS transistors continues to shrink into the sub-10 nm regime and operating voltages move toward near-threshold and even sub-threshold values, the number of electrons that make up the total charge of a CMOS node is greatly reduced. As a consequence, thermal fluctuations that shift a gate away from its equilibrium point may no longer have a negligible impact on circuit reliability. Time-domain analysis helps designers understand how transient faults affect a circuit and can guide the design of noise-resistant circuitry. However, modeling thermal noise in the time domain is computationally very costly. Moreover, small fluctuations in electron occupation introduce time-varying shifts in the biasing point, further increasing the modeling complexity. To address these challenges, this paper introduces a new approach to modeling thermal noise directly in the time domain by developing a series of stochastic differential equations (SDEs) that capture various transient effects in the presence of thermal noise. Compared with SPICE-based simulations, our approach delivers a speedup of three orders of magnitude in simulation time with comparable accuracy. This simulation framework is especially valuable for detecting rare events that could translate into fault-inducing noise transients. While detecting such rare thermal-noise events with SPICE is computationally infeasible, we introduce a new iterative approach that detects 6σ events in a matter of a few hours.
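To make the SDE modeling idea concrete, the following minimal sketch (illustrative only, not the paper's model) integrates the textbook Langevin/Ornstein-Uhlenbeck equation for Johnson-Nyquist noise on a single capacitive node via the Euler-Maruyama method. All component values (R, C, V0) are assumed for illustration; the paper's actual SDE system for transistor-level transients is more elaborate.

```python
import numpy as np

# Illustrative Euler-Maruyama integration of a Langevin (Ornstein-Uhlenbeck)
# SDE for thermal (Johnson-Nyquist) noise on a capacitive node charged
# through an effective resistance R. Parameter values are hypothetical.

k_B = 1.380649e-23   # Boltzmann constant [J/K]
T   = 300.0          # temperature [K]
R   = 100e3          # effective channel resistance [ohm] (assumed)
C   = 1e-16          # node capacitance [F] (assumed, ~0.1 fF)
V0  = 0.4            # near-threshold equilibrium node voltage [V] (assumed)

tau = R * C          # node time constant
dt  = tau / 50.0     # time step well below tau
n_steps = 200_000

rng = np.random.default_rng(0)
V = np.empty(n_steps)
V[0] = V0

# SDE:  dV = -(V - V0)/(R*C) dt + sqrt(2*k_B*T/(R*C**2)) dW
diffusion = np.sqrt(2.0 * k_B * T / (R * C**2))
for i in range(1, n_steps):
    dW = rng.normal(0.0, np.sqrt(dt))          # Wiener increment
    V[i] = V[i-1] - (V[i-1] - V0) / tau * dt + diffusion * dW

# The stationary RMS fluctuation should approach the classic kT/C value.
print(f"simulated RMS noise: {V[n_steps//2:].std()*1e3:.2f} mV")
print(f"kT/C prediction:     {np.sqrt(k_B*T/C)*1e3:.2f} mV")
```

Under these assumed values, the stationary RMS fluctuation converges to the classic sqrt(kT/C) value of roughly 6 mV, so a 6σ excursion approaches 40 mV, which illustrates why rare thermal-noise transients can become fault-inducing on a node biased near threshold.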