Near distances are overestimated in virtual reality and far distances are underestimated, but an explanation for these distortions remains elusive. One potential concern is that, whilst the eye rotates to look around the virtual scene, the virtual cameras remain static. Could using eye-tracking to update the perspective of the virtual cameras as the eye rotates improve depth perception in virtual reality? This paper identifies 14 distinct perspective distortions that could, in theory, arise in near-eye displays when the virtual cameras are kept fixed whilst the eye rotates. However, the impact of eye movements on the displayed image depends on the optical, rather than the physical, distance of the display. Since the optical distance of most head-mounted displays exceeds 1 m, most of these distortions will have only a negligible effect. The exception is ‘gaze-contingent disparities’, which will leave near virtual objects looking displaced from physical objects that are meant to be at the same distance in augmented reality.
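Why near objects are the exception can be seen with a back-of-envelope parallax sketch. When the eye rotates about its centre of rotation, its optical centre translates slightly; a static virtual camera ignores this translation, so the rendered image behaves like a real object at the display's optical distance rather than at the virtual object's intended distance. The figures below are illustrative assumptions, not values from the paper: an offset of roughly 6 mm between the eye's optical centre and its centre of rotation, a display with a 1.5 m optical distance, and a small-angle parallax approximation.

```python
import math

def viewpoint_shift(rotation_deg, offset_m=0.006):
    """Lateral translation of the eye's optical centre when the eye
    rotates about its centre of rotation (~6 mm offset assumed)."""
    return offset_m * math.sin(math.radians(rotation_deg))

def gaze_contingent_disparity(rotation_deg, object_dist_m, display_dist_m,
                              offset_m=0.006):
    """Approximate angular mismatch (degrees) between a real object at
    object_dist_m and its virtual counterpart rendered by a static camera
    on a display at optical distance display_dist_m.
    Small-angle parallax: error ~= t * (1/d_object - 1/d_display)."""
    t = viewpoint_shift(rotation_deg, offset_m)
    return math.degrees(t * (1.0 / object_dist_m - 1.0 / display_dist_m))

# A near object (30 cm) versus a far object (2 m), both compared against a
# display with a 1.5 m optical distance, after a 20 degree eye rotation:
err_near = gaze_contingent_disparity(20, 0.30, 1.5)
err_far = gaze_contingent_disparity(20, 2.0, 1.5)
print(f"near: {err_near:.3f} deg, far: {err_far:.3f} deg")
```

Under these assumptions the near object ends up misaligned by roughly a third of a degree, whilst the far object's error is an order of magnitude smaller, consistent with the claim that only near virtual objects are visibly displaced.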