The attenuation of light in star-forming galaxies correlates with a multitude of physical parameters, including star formation rate, metallicity, and total dust content. This variation in attenuation is even more pronounced on kiloparsec scales, which are relevant to many current spectroscopic integral field unit surveys. To understand the cause of this variation, we present and analyse Swift/UVOT near-UV (NUV) images and SDSS/MaNGA emission-line maps of 29 nearby ($z < 0.084$) star-forming galaxies. We resolve kiloparsec-sized star-forming regions within the galaxies and compare their optical nebular attenuation (i.e., the Balmer emission-line optical depth, $\tau_B^l \equiv \tau_{\mathrm{H}\beta} - \tau_{\mathrm{H}\alpha}$) and NUV stellar continuum attenuation (via the NUV power-law index, $\beta$) to the attenuation law described by Battisti et al. The data agree with that model, albeit with significant scatter. We explore how the scatter in the $\beta$-$\tau_B^l$ measurements of the star-forming regions depends on different physical parameters, including distance from the nucleus, star formation rate, and total dust content. Finally, we compare the measured $\tau_B^l$ and $\beta$ of the individual star-forming regions with those of the integrated galaxy light. We find a strong variation in $\beta$ between the kiloparsec scale and the larger galaxy scale that is not seen in $\tau_B^l$. We conclude that the sight-line dependence of UV attenuation and the reddening of $\beta$ by light from older stellar populations could both contribute to the $\beta$-$\tau_B^l$ discrepancy.
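For reference, a minimal sketch of the two attenuation diagnostics as they are conventionally defined; the specific fitting windows and conventions adopted in this work are an assumption here, following the standard Calzetti et al. (1994) formulation with a Case B recombination ratio of 2.86:

% Balmer emission-line optical depth, from the observed Halpha/Hbeta flux
% ratio relative to the intrinsic Case B value of 2.86 (assumed convention):
\begin{equation}
  \tau_B^l \equiv \tau_{\mathrm{H}\beta} - \tau_{\mathrm{H}\alpha}
  = \ln\!\left[\frac{\left(F_{\mathrm{H}\alpha}/F_{\mathrm{H}\beta}\right)_{\mathrm{obs}}}{2.86}\right]
\end{equation}

% NUV power-law index beta, the slope of a power-law fit to the observed
% stellar continuum in the NUV:
\begin{equation}
  f_\lambda \propto \lambda^{\beta}
\end{equation}

Under these conventions, larger $\tau_B^l$ indicates greater nebular reddening, while a redder (less negative) $\beta$ indicates greater continuum attenuation, intrinsically older stellar populations, or both.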