Most proposed algorithmic fairness techniques require access to demographic data in order to make performance comparisons and standardizations across groups; however, this data is largely unavailable in practice, hindering the widespread adoption of algorithmic fairness. In this paper, we consider calls to collect more demographic data to enable algorithmic fairness and challenge the notion that discrimination can be overcome with smart enough technical methods and sufficient data. We show how these techniques largely ignore broader questions of data governance and systemic oppression when categorizing individuals for the purpose of fairer algorithmic processing. We explore the conditions under which demographic data should be collected and used to enable algorithmic fairness methods by characterizing a range of social risks to individuals and communities. For individuals, we consider the unique privacy risks of sensitive attributes, the possible harms of miscategorization and misrepresentation, and the use of sensitive data beyond data subjects' expectations. Looking more broadly, the risks to entire groups and communities include the expansion of surveillance infrastructure in the name of fairness, the misrepresentation and mischaracterization of what it means to be part of a demographic group, and the ceding of the ability to define what constitutes biased or unfair treatment. We argue that, by confronting these questions before and during the collection of demographic data, algorithmic fairness methods are more likely to actually mitigate harmful treatment disparities without reinforcing systems of oppression. Towards this end, we assess privacy-focused methods of data collection and use, as well as participatory data governance structures, as proposals for more responsibly collecting demographic data.

CCS Concepts: • Security and privacy → Social aspects of security and privacy; Privacy protections; Economics of security and privacy; • Social and professional topics → User characteristics; Gender; Sexual orientation.