We analyze recent 2D simulations of the nonlinear evolution of the line-deshadowing instability (LDI) in hot-star winds to quantify how the associated highly clumped density structure can lead to a "turbulent porosity" reduction in continuum absorption and/or scattering. The basic method is to examine the statistical variations of mass column as a function of path length and fit these to analytic forms that lead to simple statistical scalings for the associated mean extinction. A key result is that one can characterize porosity effects on continuum transport in terms of a single "turbulent porosity length", found here to scale as H ≈ (f_cl − 1)a, where f_cl ≡ ⟨ρ²⟩/⟨ρ⟩² is the clumping factor in density ρ, and a is the density autocorrelation length. For continuum absorption or scattering in an optically thick layer, we find that the associated effective reduction in opacity scales as ∼ 1/√(1 + τ_H), where τ_H ≡ κρH is the local optical thickness of this porosity length. For these LDI simulations, the inferred porosity lengths are small, only about two percent of the stellar radius, H ≈ 0.02 R_*. For continuum processes like bound-free absorption of X-rays that are only marginally optically thick throughout the full stellar wind, this implies τ_H ≪ 1, and thus that LDI-generated porosity should have little effect on X-ray transport in such winds. The formalism developed here could, however, be important for understanding the porous regulation of continuum-driven, super-Eddington outflows from luminous blue variables.
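As a minimal illustrative sketch (not the paper's actual analysis code), the scalings quoted above can be evaluated directly: the turbulent porosity length H ≈ (f_cl − 1)a, its optical thickness τ_H = κρH, and the effective opacity reduction 1/√(1 + τ_H). The function name and the example input values below are assumptions chosen only to show that τ_H ≪ 1 yields a reduction factor near unity.

```python
import numpy as np

def porosity_opacity_reduction(kappa, rho, f_cl, a):
    """Evaluate the turbulent-porosity scalings quoted in the abstract.

    kappa : mass opacity [cm^2 g^-1]
    rho   : local mean wind density [g cm^-3]
    f_cl  : clumping factor <rho^2>/<rho>^2 (dimensionless, >= 1)
    a     : density autocorrelation length [cm]

    Returns the porosity length H, its optical thickness tau_H,
    and the effective opacity reduction factor 1/sqrt(1 + tau_H).
    """
    H = (f_cl - 1.0) * a            # turbulent porosity length
    tau_H = kappa * rho * H         # optical thickness of the porosity length
    reduction = 1.0 / np.sqrt(1.0 + tau_H)
    return H, tau_H, reduction

# Representative (assumed) wind values; for tau_H << 1 the reduction
# factor is near 1, i.e. porosity barely alters the continuum transport.
H, tau_H, red = porosity_opacity_reduction(kappa=1.0, rho=1e-14, f_cl=10.0, a=1.5e9)
print(f"H = {H:.2e} cm, tau_H = {tau_H:.2e}, opacity reduction factor = {red:.4f}")
```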