Existing tensor completion methods all require hyperparameters. These hyperparameters largely determine each method's performance, yet they are difficult to tune. In this paper, we propose a novel nonparametric tensor completion method, which formulates tensor completion as an unconstrained optimization problem and designs an efficient iterative method to solve it. In each iteration, we not only estimate the missing entries with the aid of data correlation but also account for the low-rankness of the tensor and the convergence speed of the iteration. Our iteration is based on gradient descent and approximates the descent direction with tensor matricization and singular value decomposition. Because all modes of a tensor play symmetric roles, the optimal unfolding direction may differ from one iteration to the next, so we select it in each iteration using the scaled latent nuclear norm. Moreover, we design a formula for the iteration step-size based on a nonconvex penalty. During the iterative process, we store the tensor in a sparse format and adopt the power method to compute the maximum singular value quickly. Experiments on image inpainting and link prediction show that our method is competitive with six state-of-the-art methods.

which is similar to Tucker decomposition. In the second approach, Nuclear Norm Minimization (NNM) is often used in place of Rank Minimization (RM) because RM is NP-hard [14]. Reference [15] directly minimized the tensor nuclear norm for tensor completion. Reference [16] proposed a dual framework for low-rank tensor completion via nuclear norm constraints. Since high-order tensors represent a higher-dimensional space, some works [17,18] used Riemannian manifolds for tensor completion, which is still closely linked to RM. In addition, some works [19,20] converted the tensor into matrices and realized tensor completion by means of matrix completion, but they ignored the inner structure and correlation of the data.

The aforementioned tensor completion methods all require hyperparameters, such as an upper bound on the rank in the low-rank constraint and the penalty coefficients for norms. However, selecting these hyperparameters not only consumes a substantial amount of time but also determines the performance of the methods. To address this issue, we propose a novel Nonparametric Tensor Completion (NTC) method based on gradient descent and a nonconvex penalty. We use gradient descent to solve the optimization problem of tensor completion and build a gradient tensor with tensor matricizations and Singular Value Decomposition (SVD). We select the optimal unfolding direction based on the scaled latent nuclear norm in each iteration; a sketch of mode-wise unfolding and such a selection rule is given below. The step-size in gradient descent is regarded as a penalty parameter on the singular values, and we design a nonconvex penalty for it. Furthermore, during the iterative process, we store the tensor in a sparse format and adopt the power method to compute the maximum singular value quickly (see the second sketch below). Experiments on image inpainting and link prediction show that our method is competitive with six state-of-the-art methods.
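To make the unfolding step concrete, the following is a minimal sketch of mode-k matricization, its inverse, and a mode-selection heuristic that scores each unfolding by a scaled nuclear norm, ||X_(k)||_* / sqrt(n_k). The function names (`unfold`, `fold`, `select_mode`) and the use of argmin over these scores are illustrative assumptions, not the authors' implementation; the paper's exact scaled-latent-nuclear-norm rule may differ.

```python
import numpy as np

def unfold(X, mode):
    """Mode-`mode` matricization: axis `mode` becomes the rows,
    all remaining axes are flattened into the columns."""
    return np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)

def fold(M, mode, shape):
    """Inverse of `unfold` for a tensor of the given `shape`."""
    rest = [s for i, s in enumerate(shape) if i != mode]
    return np.moveaxis(M.reshape([shape[mode]] + rest), 0, mode)

def select_mode(X):
    """Score each mode by a scaled nuclear norm ||X_(k)||_* / sqrt(n_k);
    an illustrative stand-in for the selection rule described above."""
    scores = []
    for k in range(X.ndim):
        s = np.linalg.svd(unfold(X, k), compute_uv=False)  # singular values only
        scores.append(s.sum() / np.sqrt(X.shape[k]))
    return int(np.argmin(scores))

X = np.random.rand(4, 5, 6)
k = select_mode(X)
assert np.allclose(fold(unfold(X, k), k, X.shape), X)  # round-trip check
```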
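The maximum singular value needed during the iterations can be estimated cheaply with the power method applied to the Gram matrix A^T A, since this requires only matrix-vector products and therefore works directly on a sparsely stored matrix. The sketch below follows that standard formulation; the tolerance, iteration cap, and the use of `scipy.sparse` are assumptions for illustration, not details from the paper.

```python
import numpy as np
import scipy.sparse as sp

def max_singular_value(A, n_iter=200, tol=1e-8, seed=0):
    """Estimate sigma_max(A) via the power method on A^T A.
    Works for dense arrays and scipy.sparse matrices alike,
    because only matrix-vector products are used."""
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(A.shape[1])
    v /= np.linalg.norm(v)
    sigma = 0.0
    for _ in range(n_iter):
        w = A.T @ (A @ v)            # one power step on the Gram matrix
        norm_w = np.linalg.norm(w)
        if norm_w == 0.0:            # A is (numerically) the zero matrix
            return 0.0
        v = w / norm_w
        sigma_new = np.sqrt(norm_w)  # ||A^T A v|| -> sigma_max^2 as v converges
        if abs(sigma_new - sigma) < tol:
            break
        sigma = sigma_new
    return sigma

# Example: a sparse observed-entry matrix, as in the iterative process.
A = sp.random(500, 400, density=0.01, format="csr", random_state=0)
print(max_singular_value(A))  # matches the top value of np.linalg.svd(A.toarray())
```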