Publishing in highly ranked disciplinary journals plays a critical role in career advancement. Yet the process through which journals are classified as top-tier is largely unexamined in the social work literature. To better understand the utility of various methods for determining journal quality, we compare three basic approaches to ranking disciplinary journals: reputation, h-index values, and impact factors (IFs). More specifically, we compare faculty perceptions of social work journals in 2019 with faculty perceptions in 2000, Google Scholar h-index rankings from 2010, and Clarivate Analytics' IFs from 2008 and 2017.

Method: To create a current, reputation-based ranking of disciplinary periodicals, a national sample of tenure-track faculty (N = 307) evaluated the overall quality and prestige of social work periodicals (N = 64). We obtained prior faculty perceptions of quality and prestige from Sellers et al. (2004), h-index values from Hodge and Lacasse (2011), and 2008 and 2017 IFs from Clarivate Analytics' Web of Science portal.

Results: Current faculty perceptions of quality exhibited a relatively strong correlation with faculty perceptions in 2000 (r_s = .76), suggesting that faculty perceptions of journal quality are relatively stable across decades. Among the citation-based approaches, the 2010 Google Scholar h-index values exhibited the strongest correlation with current faculty perceptions (r_s = .81), and the 2017 IFs exhibited the lowest (r_s = .48).

Conclusions: The results provide some guidance to disciplinary stakeholders making assessments about top-tier journals. For instance, the relative stability of faculty perceptions gives scholars some confidence that journals currently perceived as top-tier will remain so in the future. The results also raise questions about the utility of relying on IFs in assessments of journal quality.
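For readers unfamiliar with how such rank correlations are computed, the minimal sketch below illustrates a Spearman correlation between two journal rankings. The journal names and scores are hypothetical, invented purely for illustration; the study itself correlated full rankings of 64 social work periodicals.

```python
# Illustrative sketch of a Spearman rank correlation between two journal
# rankings (e.g., faculty prestige ratings vs. impact factors).
# All journal names and values below are hypothetical.
from scipy.stats import spearmanr

# Hypothetical mean faculty prestige ratings from a survey
faculty_ratings = {"Journal A": 4.6, "Journal B": 4.1, "Journal C": 3.8,
                   "Journal D": 3.2, "Journal E": 2.9}

# Hypothetical impact factors for the same journals
impact_factors = {"Journal A": 2.4, "Journal B": 1.1, "Journal C": 1.8,
                  "Journal D": 0.9, "Journal E": 1.3}

journals = list(faculty_ratings)
rho, p_value = spearmanr(
    [faculty_ratings[j] for j in journals],
    [impact_factors[j] for j in journals],
)
print(f"Spearman's r_s = {rho:.2f} (p = {p_value:.3f})")
```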