We present a new video stabilization technique that uses projective scene reconstruction to treat jittered video sequences. Unlike methods that recover the full three-dimensional geometry of the scene, our model captures only simple geometric relations between points and epipolar lines. With this level of scene understanding, we obtain the physical correctness of 3D stabilization methods while avoiding their lack of robustness and high computational cost. Our method tracks feature points in the scene and uses them to compute fundamental matrices that model a stabilized camera motion. We then project the tracked points onto the novel stabilized frames using epipolar point transfer and synthesize the new frames using image-based warping. Since this model is valid only for static scenes, we develop a time-view reprojection that accounts for non-stationary points in a principled way. This reprojection is based on modeling the dynamics of smooth inertial object motion in three-dimensional space, and it allows us to avoid interpolating the stabilization of moving objects from their static surroundings. We thus achieve adequate stabilization when both the camera and the objects are moving. We demonstrate the ability of our approach to stabilize hand-held video in various scenarios: scenes with no parallax that challenge 3D methods, scenes containing non-trivial parallax effects, videos with camera zooming and in-camera stabilization, as well as movies with large moving objects.
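As a rough illustration of the epipolar point transfer step mentioned above, the sketch below (not the paper's implementation; the synthetic cameras, point data, and all function names are hypothetical) builds fundamental matrices from known projection matrices and transfers points into a third view by intersecting the two epipolar lines they induce there:

```python
import numpy as np

def skew(v):
    # 3x3 matrix [v]_x such that [v]_x @ w == cross(v, w)
    return np.array([[0, -v[2], v[1]],
                     [v[2], 0, -v[0]],
                     [-v[1], v[0], 0]])

def camera_center(P):
    # camera center as the right null vector of the 3x4 projection matrix
    _, _, Vt = np.linalg.svd(P)
    C = Vt[-1]
    return C / C[3]

def fundamental_from_cameras(P_src, P_dst):
    # F maps a point in the source view to its epipolar line in the
    # destination view: l_dst = F @ x_src
    e_dst = P_dst @ camera_center(P_src)  # epipole in destination view
    return skew(e_dst) @ P_dst @ np.linalg.pinv(P_src)

def transfer_point(x1, x2, F13, F23):
    # epipolar point transfer: the point's image in view 3 lies on both
    # epipolar lines, so it is their intersection (cross product of lines)
    l1 = F13 @ x1
    l2 = F23 @ x2
    x3 = np.cross(l1, l2)
    return x3 / x3[2]

def proj(P, Xw):
    x = P @ Xw
    return x / x[2]

# hypothetical setup: translated pinhole cameras and random static points
rng = np.random.default_rng(0)
X = np.column_stack([rng.uniform(-1, 1, (8, 2)),
                     rng.uniform(4, 6, 8),
                     np.ones(8)])

def P_of(t):
    return np.hstack([np.eye(3), -t.reshape(3, 1)])

P1 = P_of(np.array([0.0, 0.0, 0.0]))
P2 = P_of(np.array([1.0, 0.0, 0.0]))
P3 = P_of(np.array([0.4, 0.3, 0.0]))  # "novel stabilized" view

F13 = fundamental_from_cameras(P1, P3)
F23 = fundamental_from_cameras(P2, P3)

errs = []
for Xw in X:
    x1, x2 = proj(P1, Xw), proj(P2, Xw)
    x3_true = proj(P3, Xw)
    x3 = transfer_point(x1, x2, F13, F23)
    errs.append(np.linalg.norm(x3[:2] - x3_true[:2]))
max_err = max(errs)
```

For static scene points the transferred positions agree with the true projections up to numerical precision; transfer degenerates only for points near the trifocal plane of the three camera centers, which is one reason moving objects require the separate time-view reprojection described in the abstract.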