We investigate the complexity and utility of performing Multiple Bit Upset (MBU) vulnerability analysis in modern microprocessors. While the Single Bit Flip (SBF) model constitutes the prevailing mechanism for capturing the effect of Single Event Upsets (SEUs) due to alpha particle or neutron strikes in semiconductors, recent radiation studies in 90nm and 65nm technology nodes demonstrate that up to 55% of such strikes result in MBUs. Consequently, the accuracy of popular vulnerability analysis methods, such as the Architectural Vulnerability Factor (AVF) and Failures In Time (FIT) rate estimates based on the SBF assumption, comes into question, especially in modern microprocessors, which contain a significant number of memory elements. To alleviate this concern, we present an extensive infrastructure that enables MBU vulnerability analysis in modern microprocessors. Using this infrastructure and a modern microprocessor model, we perform a large-scale MBU vulnerability analysis study and report two key findings: (i) the SBF fault model overestimates vulnerability by up to 71%, as compared to a more realistic modeling and distribution of faults in the 90nm and 65nm processes, and (ii) the rank-ordered lists of critical bits, as computed through the SBF and MBU models, respectively, are very similar, as indicated by an average per-bit rank difference of less than 1.45%.
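For context, a minimal sketch of the standard AVF/FIT formulation that the SBF assumption underlies is given below; the notation is illustrative and not the paper's own, following the common definition of AVF as the fraction of ACE (Architecturally Correct Execution) bit-cycles in a structure.

% Illustrative AVF/FIT relationship (standard formulation, not the paper's notation):
\begin{align}
  \mathrm{AVF}_{s} &= \frac{\sum_{c=1}^{C} \bigl(\text{ACE bits in structure } s \text{ at cycle } c\bigr)}{B_{s} \cdot C} \\
  \mathrm{FIT}_{s} &= \mathrm{AVF}_{s} \cdot B_{s} \cdot \mathrm{FIT}_{\mathrm{raw/bit}}
\end{align}
% where $B_s$ is the number of bits in structure $s$, $C$ is the number of
% simulated cycles, and $\mathrm{FIT}_{\mathrm{raw/bit}}$ is the intrinsic
% per-bit SEU rate of the technology.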