“…How to effectively exploit sparse vectors and matrices has been well studied for linear algebra problems [1,17,19,22,23,24,26,28,33,50,51,59,62,76,81,88]. The growing popularity of deep learning and big data has sparked similar interest in how machine learning kernels can take advantage of sparse tensors [15,40,41,43,67,70,75].…”