The deflationary account of representations purports to capture the explanatory role representations play in computational cognitive science. To this end, it distinguishes between mathematical contents, which represent the values and arguments of the functions cognitive devices compute, and cognitive contents, which represent the distal states of affairs cognitive systems relate to. Armed with this distinction, the deflationary account contends that computational cognitive science is committed only to mathematical contents, which suffice to provide satisfactory cognitive explanations. Here, I scrutinize the deflationary account, arguing that, as things stand, it faces two important challenges, both deeply connected with mathematical contents. The first stems from the fact that the deflationary account accepts that a satisfactory account of representations must deliver naturalized contents. Yet mathematical contents have not been naturalized, and I argue that it is very doubtful they ever will be. The second challenge concerns the explanatory power of mathematical contents. The deflationary account holds that they are always sufficient to explain cognitive phenomena. I contend that this is not the case, as mathematical contents alone cannot explain why deep neural networks misclassify adversarial examples.