Modern militaries rely on remote image sensors for real-time intelligence. A typical remote system consists of an unmanned aerial vehicle, or UAV, with an attached camera. A video stream is sent from the UAV, through a bandwidth-constrained satellite connection, to an intelligence processing unit. In this research, an upgrade to this remote-video-stream method of collection is proposed. A set of synthetic images of a scene, captured by a UAV in a virtual environment, is sent to a pipeline of computer vision algorithms collectively known as Structure from Motion. The output of Structure from Motion, a three-dimensional (3D) model, is then assessed in a 3D virtual world as a possible replacement for the images from which it was created. This study presents Structure from Motion results from a modifiable spiral flight path and compares the geoaccuracy of each result. A flattening of height is observed, and an automated compensation for this flattening is proposed and performed. Each reconstruction is also compressed, and its compressed size is compared with the compressed size of the images from which it was created. A 49–60% reduction in required space, or bandwidth, is shown. A corresponding video demonstrating this technique is available online.
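As a rough illustration of the size comparison described above, the following minimal Python sketch (not the study's actual tooling) compresses a set of source images and an exported reconstruction and reports the resulting reduction in required space. The file names and the use of zlib as the compressor are assumptions for illustration only.

    import zlib

    def compressed_size(paths):
        # Total zlib-compressed size, in bytes, of the listed files.
        total = 0
        for path in paths:
            with open(path, "rb") as f:
                total += len(zlib.compress(f.read()))
        return total

    # Hypothetical file names; substitute the captured frames and the
    # exported reconstruction (e.g., mesh plus texture) from the SfM pipeline.
    image_files = ["frame_000.png", "frame_001.png", "frame_002.png"]
    model_files = ["reconstruction.obj", "texture.jpg"]

    image_bytes = compressed_size(image_files)
    model_bytes = compressed_size(model_files)

    # Fractional reduction in space (or bandwidth) from transmitting the model
    # instead of the images; the study reports 49-60% for its flight paths.
    reduction = 1.0 - model_bytes / image_bytes
    print(f"Reduction in required space: {reduction:.0%}")

In practice the comparison would use the same codec on both sides (for example, the compressed video frames versus a compressed mesh-and-texture archive), so the reported percentage depends on the chosen formats as well as on the scene.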