Partial shading can significantly impair the efficiency of thin-film solar cells. When exposed to partial shading, cells within the array tend to become reverse biased, leading to thermal runaway events and the emergence of hotspots. In Cu(In,Ga)Se2 (CIGS) solar cells, such hotspots are also associated with so-called worm-like defects. Both theoretical and experimental studies have shown that in CIGS, a positive feedback loop leads to instability and thermal runaway. However, we observe an inconsistency between published simulation results and recently published experimental work. A recent experimental study showed that, under certain conditions, a hotspot develops within 1 ms, with signs of melting of the CIGS over an area with a 5 μm radius. In published simulation results, however, such high temperatures take on the order of seconds to develop, a discrepancy of three orders of magnitude. In this work, we argue that this discrepancy is explained by the size of the seed defect, and that the experimentally observed, fast-developing hotspots therefore likely originate from microscopic defects. To this end, we developed an electro-thermal finite element model with very high temporal and spatial resolution. We demonstrate that, assuming a seed defect with a 10 nm radius, we can reproduce the experimental results with respect to both the size of the resulting defect and the time it took to develop.
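For context, a minimal sketch of the coupled equations that typically underlie such an electro-thermal model is given below; the notation and the assumption of a thermally activated conductivity are illustrative and are not taken from the specific model developed in this work.
\[
\rho c_p \frac{\partial T}{\partial t} = \nabla \cdot \bigl( k \nabla T \bigr) + \sigma(T)\,\lvert \nabla \varphi \rvert^{2},
\qquad
\nabla \cdot \bigl( \sigma(T) \nabla \varphi \bigr) = 0,
\]
where \(T\) is the temperature, \(\varphi\) the electric potential, \(\rho c_p\) the volumetric heat capacity, \(k\) the thermal conductivity, and \(\sigma(T)\) a temperature-dependent electrical conductivity. If \(\sigma\) increases with \(T\) (for example, thermally activated shunt conduction at the seed defect), the Joule heating term \(\sigma(T)\lvert \nabla \varphi \rvert^{2}\) grows as the spot heats up, which is the positive feedback that can drive thermal runaway.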