A seminal result of Hajnal and Szemerédi states that if a graph G with n vertices has minimum degree δ(G) ≥ (r − 1)n/r for some integer r ≥ 2, then G contains a K_r-factor, assuming r divides n. The extremal examples showing that the bound on δ(G) is optimal are very structured and, in particular, contain large independent sets. In analogy with Ramsey–Turán theory, Balogh, Molla, and Sharifzadeh initiated the study of how the absence of such large independent sets influences the sufficient minimum degree. We show the following two related results:

• For any r > ℓ ≥ 2, if G is a graph satisfying δ(G) ≥ ((r − ℓ)/(r − ℓ + 1)) n + Ω(n) and α_ℓ(G) = o(n), that is, a largest K_ℓ-free induced subgraph of G has at most o(n) vertices, then G contains a K_r-factor. This is optimal for ℓ = r − 1 and extends a result of Balogh, Molla, and Sharifzadeh, who considered the case r = 3.

• If a graph G satisfies δ(G) = Ω(n) and α*_r(G) = o(n), that is, every induced K_r-free r-partite subgraph of G has at least one vertex class of size o(n), then G contains a K_r-factor. A similar statement is proven for a general graph H.
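
As a minimal worked instance (not part of the original abstract; it only spells out the arithmetic of the bounds above): for r = 3 and ℓ = 2, the parameter α_2(G) is the usual independence number α(G), and the first result reads

  δ(G) ≥ (3 − 2)/(3 − 2 + 1) · n + Ω(n) = n/2 + Ω(n)  together with  α(G) = o(n)  ⟹  G contains a triangle factor,

compared with the Hajnal–Szemerédi bound δ(G) ≥ (3 − 1)n/3 = 2n/3, which is needed when no independence-number assumption is made. This is the case r = 3 treated by Balogh, Molla, and Sharifzadeh.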