With the continual growth of data, it is more important than ever to develop efficient and robust methods for solving the consistent matrix equation AXB = C. The randomized Kaczmarz (RK) method has received considerable attention because of its computational efficiency and low memory footprint. A recently proposed variant is the matrix equation relaxed greedy RK (ME-RGRK) method, which uses the loss of each index pair as a greedy threshold to detect and avoid projecting onto working rows that are too far from the current iterate. In this work, we employ Polyak's and Nesterov's momentum techniques to further accelerate the convergence of the ME-RGRK method. The resulting methods are shown to converge linearly to the least-squares solution with minimum Frobenius norm. Finally, numerical experiments are provided to illustrate the feasibility and effectiveness of the proposed methods. In addition, a real-world application, namely tensor product surface fitting in computer-aided geometric design, is presented for illustrative purposes.
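As a point of reference only (not the precise ME-RGRK iteration developed in the paper), a Polyak-type heavy-ball momentum update for a Kaczmarz-type iterate can be sketched as follows, where the step direction D^{(k)} and the momentum parameter \beta are placeholder symbols standing in for the quantities defined later:

% Minimal sketch, assuming a generic Kaczmarz-type step direction D^{(k)}
% (e.g., a projection-based correction of X^{(k)}) and a fixed momentum
% weight \beta; this is not the exact ME-RGRK update rule.
\begin{equation*}
  X^{(k+1)} \;=\; X^{(k)} \;+\; D^{(k)} \;+\; \beta\,\bigl(X^{(k)} - X^{(k-1)}\bigr),
  \qquad 0 \le \beta < 1 .
\end{equation*}

In a Nesterov-type variant, the Kaczmarz-type step is instead applied at the extrapolated point Y^{(k)} = X^{(k)} + \beta\,(X^{(k)} - X^{(k-1)}), which is the standard way the two momentum schemes differ.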