This work establishes new probability bounds for sums of independent random Hermitian tensors. These bounds characterize the large-deviation behavior of the extreme eigenvalues of such sums. We extend the Laplace transform method and Lieb's concavity theorem from matrices to tensors, and apply these tools to generalize the classical bounds associated with the names of Chernoff, Bennett, and Bernstein from the scalar to the tensor setting. Tail bounds for the norm of a sum of random rectangular tensors are then derived as corollaries of the Hermitian case. The same proof mechanism applies to tensor-valued martingales, yielding tensor versions of the Azuma, Hoeffding, and McDiarmid inequalities.
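For orientation, a representative instance of the matrix-case prototype that the tensor results generalize is the standard matrix Chernoff bound (stated here for the matrix setting only, not as the paper's tensor theorem):

```latex
% Matrix Chernoff bound (matrix prototype of the tensor generalization).
% Let $X_1,\dots,X_n$ be independent random Hermitian $d \times d$ matrices
% satisfying $0 \preceq X_k \preceq R\, I$ almost surely, and set
% $\mu_{\max} := \lambda_{\max}\!\big(\sum_{k} \mathbb{E}\, X_k\big)$. Then, for $\delta \ge 0$,
\[
  \Pr\!\left\{ \lambda_{\max}\!\Big(\sum_{k=1}^{n} X_k\Big) \ge (1+\delta)\,\mu_{\max} \right\}
  \;\le\; d \cdot \left[ \frac{e^{\delta}}{(1+\delta)^{1+\delta}} \right]^{\mu_{\max}/R}.
\]
```

The dimensional factor $d$ in front is the hallmark of the Laplace transform method combined with Lieb's concavity theorem; the tensor bounds replace it with an analogous factor determined by the tensor dimensions.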