This paper critiques the new mechanistic explanatory program on the grounds that, even when applied to the kinds of examples it was originally designed to treat, it does not distinguish correct explanations from those that blunder. First, I offer a systematization of the explanatory account, according to which explanations are mechanistic models that satisfy three desiderata: they must (1) represent causal relations, (2) describe the proper parts, and (3) depict the system at the right 'level.' Second, I argue that even the most developed attempts to fulfill these desiderata fall short, because they fail to adequately constrain the class of explanatorily apt mechanistic models.