A simulation study of inferential analysis under data overload was conducted with professional intelligence analysts. Using a process tracing methodology, the study identified patterns in information sampling and sources of inaccurate statements when analysts analyzed a topic outside their base of expertise, worked under a tight deadline, and faced a large data set. The main contribution of this study is a better understanding of potential vulnerabilities in inferential analysis under challenging conditions. These vulnerabilities are informative because they point to a set of design criteria that human-centered solutions to data overload should meet in order to be useful. These design criteria are interesting, in part, because they are so difficult to address: they are not amenable to simple, straightforward adjustments or feature additions to current tools. Meeting them will require innovative design concepts.