Close-range photogrammetry can be used to reconstruct dense point clouds of an object with very high surface coverage, making it useful for manufacturing metrology tasks such as part inspection and validation. However, compared to competing techniques, data processing times can be slow. In this paper, we present a method to autonomously remove the background from the images within a photogrammetric dataset. We show that using the masked images directly in the reconstruction results in substantially lower data processing times and lower memory utilisation. Furthermore, we show that the point density on the object surface is increased while the number of superfluous background points is reduced. Finally, a set of reconstruction results is compared to a set of tactile coordinate measurements. Reconstructions with the background removed are shown to have a standard deviation in the point-to-mesh distance up to 30 µm lower than when the background is not removed. This improvement is likely because the background is static relative to the object on the rotation stage, so points detected and matched on the background introduce triangulation errors. The proposed approach is shown to be robust across several example artefacts and can, therefore, be implemented to improve the measurement efficiency and measurement results of photogrammetric coordinate measurement systems.