We show new scalar and matrix Chernoff-style concentration bounds for a broad class of probability distributions over {0,1}^n. Building on developments in high-dimensional expanders (Kaufman and Mass, ITCS'17; Dinur and Kaufman, FOCS'17; Kaufman and Oppenheim, Combinatorica'20) and matroid theory (Adiprasito et al., Ann. Math.'18), a breakthrough result of Anari, Liu, Oveis Gharan, and Vinzant (STOC'19) showed that the up-down random walk on matroid bases has polynomial mixing time, making it possible to efficiently sample from the (weighted) uniform distribution over matroid bases. Since then, there has been a flurry of related work proving polynomial mixing times for random walks used to sample from a wide range of discrete probability distributions.

Many works have observed that, as a corollary of their mixing time analysis, one can obtain scalar concentration for 1-Lipschitz functions of samples from the stationary distribution, via standard arguments that convert bounds on the modified log-Sobolev (MLS) or Poincaré constant of a random walk into concentration results for the stationary distribution of that walk; see, for example, Hermon and Salez (arXiv'19). Several recent works have considered a matrix analog of the Poincaré inequality for a random walk and its stationary distribution. Using this matrix Poincaré inequality, these works derive concentration results for matrix-valued, spectral-norm-Lipschitz functions of samples from the distribution; see, for example, Aoun et al. (Adv. Math.'20). Unfortunately, these bounds are weak in many important regimes.

A recently developed strategy for analyzing up-down walks is based on a novel notion of spectral independence, which quantifies the dependence between variables in a distribution over {0,1}^n using the largest eigenvalue of an associated pairwise influence matrix I; see Anari et al. (FOCS'20).
Many works on spectral independence have in fact bounded a stronger quantity, ∥I∥_∞→∞ ≥ λ_max(I), which we call ℓ∞-independence. We show that any distribution over {0,1}^n with bounded ℓ∞-independence satisfies a matrix Chernoff bound that in key regimes is much stronger than the spectral-norm-Lipschitz function concentration bounds derived from matrix Poincaré inequalities. Our bounds match the matrix Chernoff bound for independent random variables due to Tropp, which is the strongest known in many cases and is essentially tight in several key settings. For spectral graph sparsification, our matrix concentration results are exponentially stronger than those obtained from matrix Poincaré inequalities. Our matrix Chernoff bound is a broad generalization and strengthening of the matrix Chernoff bound of Kyng and Song (FOCS'18). Using our bound, we conclude as a corollary that a union of O(log|V|) random spanning trees gives a spectral sparsifier of a graph with |V| vertices with high probability, matching results for independent edge sampling, and matching lower bounds from Kyng and Song. This improves on the O(log^2 |V|) spanning trees required by previous analyses.
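To make the two quantities concrete, the following sketch computes a pairwise influence matrix I for a toy positively correlated distribution over {0,1}^3 and compares ∥I∥_∞→∞ (the maximum absolute row sum, our ℓ∞-independence quantity) with λ_max(I) (the spectral independence quantity). The entry convention I[i,j] = P[x_j = 1 | x_i = 1] − P[x_j = 1 | x_i = 0] follows the definition used in the spectral independence literature (Anari et al., FOCS'20); the specific toy distribution is an illustrative assumption, not taken from the paper.

```python
import itertools
import numpy as np

# Toy distribution mu over {0,1}^n (hypothetical example, chosen only for
# illustration): unnormalized weight 2^(#agreeing coordinate pairs), which
# makes the coordinates positively correlated.
n = 3
mu = {}
for x in itertools.product([0, 1], repeat=n):
    mu[x] = 2.0 ** sum(x[i] == x[j] for i in range(n) for j in range(i + 1, n))
Z = sum(mu.values())
mu = {x: p / Z for x, p in mu.items()}

def marginal(j, cond_i, cond_val):
    """P[x_j = 1 | x_{cond_i} = cond_val] under mu."""
    num = den = 0.0
    for x, p in mu.items():
        if x[cond_i] != cond_val:
            continue
        den += p
        num += p * x[j]
    return num / den

# Pairwise influence matrix: I[i, j] = P[x_j=1 | x_i=1] - P[x_j=1 | x_i=0],
# with zero diagonal.
I = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        if i != j:
            I[i, j] = marginal(j, i, 1) - marginal(j, i, 0)

# ell_infinity-independence quantity: induced inf->inf norm = max abs row sum.
linf_norm = np.max(np.sum(np.abs(I), axis=1))
# Spectral independence quantity: largest eigenvalue of I.
lam_max = np.max(np.linalg.eigvals(I).real)

print("||I||_inf->inf =", linf_norm, " lambda_max(I) =", lam_max)
# Sanity check of the inequality ||I||_inf->inf >= lambda_max(I):
assert linf_norm >= lam_max - 1e-9
```

For this symmetric toy distribution the off-diagonal influences are equal and positive, so the two quantities coincide; in general, bounding ∥I∥_∞→∞ is the stronger requirement, since it always upper bounds λ_max(I).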