Given a set of synchronized video sequences of a dynamic scene taken by different cameras, we address the problem of creating a virtual video of the scene from a novel viewpoint. A key aspect of our algorithm is a method for recursively propagating dense and physically accurate correspondences between the two video sources. By exploiting temporal continuity and suitably constraining the correspondences, we provide an efficient framework for synthesizing realistic virtual video. The stability of the propagation algorithm is analyzed, and experimental results are presented.

Index Terms: Correspondence, image-based rendering, view synthesis, virtual video, virtual views.