The problem of incomplete data is common in signal processing and machine learning. Tensor completion algorithms aim to recover incomplete data from its partially observed entries. In this paper, taking advantage of the high compressibility and flexibility of the recently proposed tensor ring (TR) decomposition, we propose a new tensor completion approach named tensor ring weighted optimization (TR-WOPT). It finds the latent factors of the incomplete tensor by a gradient descent algorithm; the latent factors are then employed to predict the missing entries of the tensor. We conduct various tensor completion experiments on synthetic data and real-world data. The simulation results show that TR-WOPT performs well on various high-dimensional tensors. Furthermore, image completion results show that our proposed algorithm outperforms state-of-the-art algorithms in many situations. In particular, when the missing rate of the test images is high (e.g., over 0.9), the performance of TR-WOPT is significantly better than that of the compared algorithms.

Corresponding authors: Jianting Cao (cao@sit.ac.jp) and Qibin Zhao (qibin.zhao@riken.jp).

The main contributions of this work are as follows:

• Based on the recently proposed TR decomposition, we propose a new tensor completion algorithm named tensor ring weighted optimization (TR-WOPT).
• The TR latent factors are optimized by a gradient descent method and then used to predict the missing entries of the incomplete tensor (a minimal sketch of this procedure is given at the end of this section).
• We conduct several simulation experiments and real-world data experiments. The experimental results show that our method outperforms state-of-the-art tensor completion algorithms in various situations. We also find ...

Figure 2: A graphical representation of tensor ring decomposition.

The TT decomposition, however, has several limitations: i) the constraint on the border TT-ranks, i.e., $r_1 = r_{d+1} = 1$, leads to limited representation ability and flexibility; ii) TT-ranks are bounded by the ranks of the corresponding matricizations, which might not be optimal; iii) permuting the data tensor yields an inconsistent solution, i.e., TT representations and TT-ranks are sensitive to the order of tensor dimensions, and finding the optimal permutation remains a challenging problem.

In this paper, we introduce a new structure of tensor networks, which can be considered as a generalization of TT representations. First of all, we relax the condition over the TT-ranks, i.e., $r_1 = r_{d+1} = 1$, leading to an enhanced representation ability. Secondly, the strict ordering of multilinear products between cores should be alleviated. Third, the cores should be treated equivalently, making the model symmetric. To this end, we add a new connection between the first and the last core tensors, yielding a circular tensor product over a set of cores (see Fig. 2). More specifically, we consider that each tensor element is approximated by performing a trace operation over the sequential multilinear products of cores. Since the trace operation ensures a scalar output, the condition $r_1 = r_{d+1} = 1$ is not necessary. In addition, the cores can be circularly shifted and treated equivalently due to the properties of the trace operation. We call this model tensor ring (TR) decomposition and its cores tensor ring (TR) representations. To learn TR representations, we first develop a TR-SVD algorithm that is similar to the TT-SVD algorithm (Oseledets, 2011). To obtain suitable TR-ranks, a block-wise ALS algorithm is presented. Finally, ...
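Spelling out the trace construction just described, each element of a $d$th-order tensor is approximated by a circular multilinear product of core slices:

\[
\mathcal{T}(i_1, i_2, \ldots, i_d) \approx \operatorname{Tr}\bigl( \mathbf{Z}_1(i_1)\, \mathbf{Z}_2(i_2) \cdots \mathbf{Z}_d(i_d) \bigr),
\]

where $\mathbf{Z}_k(i_k) \in \mathbb{R}^{r_k \times r_{k+1}}$ is the $i_k$-th lateral slice of the $k$-th core and $r_{d+1} = r_1$ closes the ring. The trace yields a scalar for any shared border rank $r_1$, which is why the TT constraint $r_1 = r_{d+1} = 1$ can be dropped.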
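Returning to the completion setting described at the beginning of this section, the following is a minimal NumPy sketch of the TR-WOPT idea for a 3rd-order tensor: initialize the TR cores randomly, run gradient descent on the squared error over the observed entries only, and read the missing entries off the final reconstruction. This is an illustrative toy under stated assumptions, not the authors' implementation; the function names (`tr_reconstruct`, `tr_wopt_3d`), the plain fixed-step gradient descent, the initialization scale, and the step size are all assumptions and will likely need tuning.

```python
import numpy as np

def tr_reconstruct(Z1, Z2, Z3):
    """Full tensor from three TR cores; Z_k has shape (r_k, n_k, r_{k+1}), r_4 = r_1.

    Element-wise: T_hat(i,j,k) = Tr( Z1[:, i, :] @ Z2[:, j, :] @ Z3[:, k, :] ).
    """
    return np.einsum('aib,bjc,cka->ijk', Z1, Z2, Z3)

def tr_wopt_3d(T, W, ranks, lr=1e-2, n_iter=3000, seed=0):
    """Toy TR weighted optimization for a 3rd-order tensor.

    T : tensor holding the observed values (missing entries may contain anything)
    W : binary mask of the same shape, 1 = observed, 0 = missing
    """
    n1, n2, n3 = T.shape
    r1, r2, r3 = ranks
    rng = np.random.default_rng(seed)
    Z1 = 0.3 * rng.standard_normal((r1, n1, r2))  # illustrative init scale
    Z2 = 0.3 * rng.standard_normal((r2, n2, r3))
    Z3 = 0.3 * rng.standard_normal((r3, n3, r1))
    for _ in range(n_iter):
        # weighted residual: errors are counted on observed entries only
        E = W * (T - tr_reconstruct(Z1, Z2, Z3))
        # gradients of 0.5 * ||E||_F^2 w.r.t. each core, obtained by
        # contracting the masked residual with the other two cores of the ring
        G1 = -np.einsum('ijk,bjc,cka->aib', E, Z2, Z3)
        G2 = -np.einsum('ijk,aib,cka->bjc', E, Z1, Z3)
        G3 = -np.einsum('ijk,aib,bjc->cka', E, Z1, Z2)
        Z1 -= lr * G1
        Z2 -= lr * G2
        Z3 -= lr * G3
    return tr_reconstruct(Z1, Z2, Z3)

# usage: recover a random low-TR-rank tensor from 30% of its entries
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = 0.5 * rng.standard_normal((2, 20, 3))
    B = 0.5 * rng.standard_normal((3, 20, 2))
    C = 0.5 * rng.standard_normal((2, 20, 2))
    T_true = tr_reconstruct(A, B, C)
    W = (rng.random(T_true.shape) < 0.3).astype(float)
    T_hat = tr_wopt_3d(W * T_true, W, ranks=(2, 3, 2))
    err = np.linalg.norm(T_hat - T_true) / np.linalg.norm(T_true)
    print(f"relative recovery error: {err:.3f}")
```

Each `einsum` in the loop is the exact gradient of the weighted loss with respect to one core; for tensors of order higher than three the same pattern applies with longer chains of contracted cores, and the paper's actual optimizer, initialization, and stopping rule may differ from this sketch.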