Tensor dimensionality reduction (TDR) is an active research topic in machine learning: it learns data representations that preserve the original data structure, avoid converting samples into vectors, and alleviate the curse of dimensionality of tensor data. In this work, a novel TDR approach based on the mode product and the Hilbert-Schmidt independence criterion (HSIC) is proposed. The contributions of the authors' work are as follows. (1) HSIC measures the statistical dependence of two random variables. Rather than measuring this dependence directly, HSIC first maps the two random variables into two reproducing kernel Hilbert spaces (RKHSs) and then measures the dependence of the mapped variables via Hilbert-Schmidt operators between the two RKHSs. The use of RKHSs increases the flexibility and applicability of HSIC. Although HSIC is widely used in machine learning, the authors are not aware of its application to dimensionality reduction (DR), apart from their own previous work. (2) A novel HSIC-based TDR approach is proposed, which applies HSIC to capture the statistical information of a tensor data set for DR. The authors give the mathematical derivation of HSIC for tensor data and establish a TDR framework based on HSIC, named HSIC-TDR for short, which aims to improve DR results for tensors by exploring and preserving the statistical information of the original data set. (3) Furthermore, to solve the out-of-sample problem, the authors learn an explicit mapping between the dimensionality-reduced tensors and the higher-dimensional tensors by introducing the mode product into HSIC-TDR. Experimental comparisons between the proposed method and other state-of-the-art algorithms on various data sets demonstrate the good performance of the proposed method.
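As a concrete illustration of the dependence measure the paper builds on, the following is a minimal sketch (not the authors' implementation) of the standard empirical HSIC estimator of Gretton et al., tr(KHLH)/(n-1)^2, where K and L are kernel (Gram) matrices of the two samples and H is the centering matrix. The Gaussian kernel, the bandwidths, and the toy data are assumptions made purely for illustration; the paper's tensor-specific derivation and mode-product projection are not shown here.

```python
import numpy as np

def gaussian_kernel(Z, sigma=1.0):
    """Gram matrix of the Gaussian (RBF) kernel for row-wise samples Z (n x d)."""
    sq_norms = np.sum(Z ** 2, axis=1)
    sq_dists = sq_norms[:, None] + sq_norms[None, :] - 2.0 * Z @ Z.T
    return np.exp(-sq_dists / (2.0 * sigma ** 2))

def empirical_hsic(X, Y, sigma_x=1.0, sigma_y=1.0):
    """Biased empirical HSIC estimate between paired samples X (n x d_x) and Y (n x d_y)."""
    n = X.shape[0]
    K = gaussian_kernel(X, sigma_x)
    L = gaussian_kernel(Y, sigma_y)
    H = np.eye(n) - np.ones((n, n)) / n          # centering matrix
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

# Toy check (illustrative only): a sample that depends on X yields a markedly
# larger HSIC value than an independent one.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
Y_dep = X[:, :2] + 0.1 * rng.normal(size=(200, 2))   # statistically dependent on X
Y_ind = rng.normal(size=(200, 2))                    # independent of X
print(empirical_hsic(X, Y_dep), empirical_hsic(X, Y_ind))
```

In an HSIC-based DR scheme, an estimator of this kind is what the projection (here, the mode product applied to tensor samples) is optimized against, so that the low-dimensional representation retains the statistical information of the original data.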