Inspecting the condition of underwater pipelines (UPs) with autonomous underwater vehicles (AUVs) requires highly accurate positioning while the AUV is moving along the object being examined. Currently, acoustic, magnetometric, and visual means are used to detect and track UPs with AUVs. Compared with other methods, visual navigation can provide higher accuracy for local maneuvering at short distances to the object. In the authors' view, the potential of video information for these purposes has not yet been fully exploited, so the present study focuses on making more efficient use of stereo images taken by an AUV's video camera. To this end, a new method has been developed to address the inspection task, which consists in extracting the visible boundaries of the UP and computing its centerline using algorithms that jointly process 2D and 3D video data. Three techniques for the initial recognition of the UP's direction upon its detection were analyzed: from a stereo pair of images using point features of the pipe surface; using tangent planes to the UP in one image of the stereo pair; and using the UP median planes in both images of the stereo pair. Approaches for determining the parameters of the relative position of the AUV and the UP during subsequent tracking are also considered. The proposed technology can be of practical use in developing navigation systems for UP inspection that do not require additional expensive equipment, applied either on its own or in combination with measurements from other sensors.
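As a rough illustration of the first of the three techniques, the sketch below triangulates matched surface features from a rectified stereo pair and takes the principal component of the resulting 3D point cloud as an estimate of the pipeline direction, with the centroid serving as a reference point for relative positioning. The rectified-camera model, the function names, and the PCA-based axis fit are illustrative assumptions for this sketch, not the algorithm described in the article.

```python
import numpy as np

def triangulate_rectified(pts_left, pts_right, f, baseline, cx, cy):
    """Triangulate matched features from a rectified stereo pair.

    pts_left, pts_right: (N, 2) pixel coordinates of the same pipe-surface
    points in the left and right images; f is the focal length (px),
    baseline the stereo baseline (m), and (cx, cy) the principal point (px).
    """
    disparity = pts_left[:, 0] - pts_right[:, 0]       # horizontal offset between views
    z = f * baseline / np.maximum(disparity, 1e-6)     # depth from disparity
    x = (pts_left[:, 0] - cx) * z / f
    y = (pts_left[:, 1] - cy) * z / f
    return np.column_stack([x, y, z])                  # points in the camera frame

def estimate_pipe_direction(points_3d):
    """Estimate the dominant direction of the triangulated surface points.

    When the matched features are spread along the pipe, the first principal
    component of the point cloud approximates the pipeline axis; the centroid
    gives a rough reference point near the pipe for relative positioning.
    """
    centroid = points_3d.mean(axis=0)
    _, _, vt = np.linalg.svd(points_3d - centroid, full_matrices=False)
    direction = vt[0]                                   # unit vector along the axis
    return centroid, direction
```

A full implementation would also have to reject features that do not lie on the pipe surface and account for the pipe's curvature, so this fragment should be read only as a starting point for the stereo point-feature approach mentioned above.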