Even in public health's early stages, when John Snow first identified the source of cholera's spread in London, it was clear that social determinants played a critical role in people's health: in this case, where people lived, worked, congregated, and accessed water determined their health outcomes.1 Since then, research has consistently demonstrated that long-standing systemic inequities place people from socially and economically disadvantaged groups at increased risk of poorer health outcomes.2-4 Despite such evidence, public health professionals and institutions historically sought to mitigate health disparities primarily through targeted communications and culturally tailored interventions, implying that understanding was lacking at the individual level or that cultural differences were somehow producing unhealthy outcomes. These approaches failed to fully recognize, much less address, the role of environment, community, and society's inherent structures and failings (eg, inequitable distribution of resources, racism, discrimination, and environmental injustice) in producing disparate health outcomes.

In 1978, the World Health Organization (WHO) Alma Ata Declaration asserted that "health is a fundamental human right" and called for all governments,