This study takes a sociology of quantification approach to explore the impact of ‘commensurative’ processes in data journalism, in which distinct incidents and entities are rendered similar, aggregated, and shaped into elaborate abstract constructs. This literature emphasizes the political-economic contexts of data production and predicts a heavy reliance on government data, a preference for national over local data, and a tendency to take data categories for granted, with inconsistent scrutiny. A content analysis of data journalism projects at legacy and non-legacy outlets offers partial support for these expectations. Findings show an increasing tendency to portray events as abstract metrics and decreasing attention to personal, lived anecdotes. They also show a growing tendency to cite indeterminate data sources and to provide limited overt, accessible evidence of data scrutiny. National-level sourcing outpaced local sourcing across all years, and a decline in government sourcing was coupled with a rise in self-gathered, crowdsourced online data. Legacy outlets were more likely than non-legacy outlets to use local data sources, to pair data presentations with anecdotal reporting, and to draw on government data. Non-legacy outlets were more likely to produce complex abstract metrics.