Radiation feedback from stellar clusters is expected to play a key role in setting the rate and efficiency of star formation in giant molecular clouds (GMCs). To investigate how radiation forces influence realistic turbulent systems, we have conducted a series of numerical simulations employing the Hyperion radiation hydrodynamics solver, considering the regime that is optically thick to ultraviolet (UV) and optically thin to infrared (IR) radiation. Our model clouds cover initial surface densities $\Sigma_{\rm cl,0} \sim 10$–$300\,M_\odot\,{\rm pc}^{-2}$, with varying initial turbulence. We follow them through turbulent, self-gravitating collapse, formation of star clusters, and cloud dispersal by stellar radiation. All our models display a lognormal distribution of gas surface density $\Sigma$; for an initial virial parameter $\alpha_{\rm vir,0} = 2$, the lognormal standard deviation is $\sigma_{\ln\Sigma} = 1$–$1.5$ and the star formation rate coefficient is $\varepsilon_{\rm ff,\rho} = 0.3$–$0.5$, both of which are sensitive to turbulence but not to radiation feedback. The net star formation efficiency $\varepsilon_{\rm final}$ increases with $\Sigma_{\rm cl,0}$ and decreases with $\alpha_{\rm vir,0}$. We interpret these results via a simple conceptual framework in which steady star formation increases the radiation force, so that local gas patches at successively higher $\Sigma$ become unbound. Based on this formalism (with fixed $\sigma_{\ln\Sigma}$), we provide an analytic upper bound on $\varepsilon_{\rm final}$ that is in good agreement with our numerical results. The final star formation efficiency depends on the distribution of Eddington ratios in the cloud and is strongly increased by turbulent compression of the gas.
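
To make the conceptual framework concrete, the sketch below numerically solves for a self-consistent upper bound on the star formation efficiency implied by a lognormal surface-density PDF combined with an Eddington threshold that rises as stars form. This is an illustrative sketch only, not the paper's exact derivation: the normalization `Sigma_E1`, the mean surface density `Sigma_mean`, and the assumed linear scaling of the Eddington threshold with the stellar fraction are hypothetical choices made for demonstration.

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import brentq

# Illustrative assumptions (not the paper's exact expressions):
# - the PDF of ln(Sigma) is Gaussian with dispersion sigma_lnSigma,
#   normalized so that the mean surface density equals Sigma_mean;
# - a local patch becomes unbound (super-Eddington) once its Sigma drops
#   below a threshold Sigma_E that grows in proportion to the stellar
#   mass fraction eps: Sigma_E(eps) = eps * Sigma_E1.
sigma_lnSigma = 1.5      # lognormal width quoted in the abstract
Sigma_mean = 100.0       # assumed mean cloud surface density [Msun / pc^2]
Sigma_E1 = 300.0         # hypothetical Eddington normalization [Msun / pc^2]

# Mean of ln(Sigma) such that <Sigma> = exp(mu + sigma^2 / 2) = Sigma_mean
mu = np.log(Sigma_mean) - 0.5 * sigma_lnSigma**2

def bound_fraction(Sigma_E):
    """Fraction of gas with Sigma > Sigma_E, i.e. still sub-Eddington and bound."""
    if Sigma_E <= 0.0:
        return 1.0
    return 1.0 - norm.cdf((np.log(Sigma_E) - mu) / sigma_lnSigma)

def residual(eps):
    # Stars can only form from gas that remains bound, so eps cannot exceed
    # the bound mass fraction evaluated at the corresponding threshold
    # Sigma_E(eps); the fixed point gives an upper bound on the final SFE.
    return bound_fraction(eps * Sigma_E1) - eps

eps_max = brentq(residual, 1e-6, 1.0 - 1e-6)
print(f"illustrative upper bound on final SFE: eps_max ~ {eps_max:.2f}")
```

With the hypothetical numbers above this yields an upper bound of roughly a few tens of percent; the point of the sketch is only that a wider lognormal (stronger turbulent compression) places more mass at high $\Sigma$ and therefore raises the bound, in line with the trend stated in the abstract.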