Touch is an important modality for recovering object shape. We present a method for a robot to complete a partial shape model by local tactile exploration. In local tactile exploration, the finger is constrained to follow the local surface; this is useful for recovering information about a contiguous portion of the object and is frequently employed by humans. We make three contributions. First, we show how to segment an initial point cloud of a grasped, unknown object into hand and object. Second, we present a local tactile exploration planner that combines a Gaussian Process (GP) model of the object surface with an AtlasRRT planner. The GP predicts the unexplored surface and the uncertainty of that prediction; the AtlasRRT creates a tactile exploration path across this predicted surface, driving it towards the region of greatest uncertainty. Finally, we compare the planner experimentally with alternatives in simulation and demonstrate the complete approach on a real robot. We show that our planner successfully traverses the object and that the full object shape can be recovered with a good degree of accuracy.
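As a rough illustration only (not the authors' implementation), the Python sketch below shows the kind of GP implicit-surface model the abstract refers to: contact points constrain the surface, the posterior standard deviation quantifies prediction uncertainty, and candidate contacts are ranked by a trade-off between being likely on the surface and being highly uncertain. All points, labels, kernel parameters, and the scoring rule are hypothetical placeholders.

```python
# Minimal sketch of a GP implicit surface with uncertainty-driven candidate
# selection. Assumptions: on-surface contacts are labelled 0, an assumed
# interior point -1, assumed exterior points +1; all coordinates are made up.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Hypothetical fingertip contact points measured so far (metres).
contacts = np.array([[0.00, 0.00, 0.05],
                     [0.02, 0.01, 0.04],
                     [-0.01, 0.02, 0.05]])
interior = np.array([[0.0, 0.0, 0.0]])            # assumed point inside the object
exterior = np.array([[0.0, 0.0, 0.2],             # assumed points well outside
                     [0.2, 0.0, 0.0]])

X = np.vstack([contacts, interior, exterior])
y = np.hstack([np.zeros(len(contacts)),
               -np.ones(len(interior)),
               np.ones(len(exterior))])

# Fixed-length-scale kernel (optimizer disabled) to keep the sketch deterministic.
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.05),
                              alpha=1e-6, optimizer=None)
gp.fit(X, y)

# Candidate points near the last contact; a planner like the one described
# above would steer towards a point predicted to lie on the surface
# (implicit value near zero) with the highest predictive uncertainty.
rng = np.random.default_rng(0)
candidates = contacts[-1] + rng.uniform(-0.05, 0.05, size=(200, 3))
mean, std = gp.predict(candidates, return_std=True)
score = std - 10.0 * np.abs(mean)   # hypothetical trade-off weight
best = candidates[np.argmax(score)]
print("Most informative candidate contact:", best)
```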