In this work, we examine the optimality of Gaussian signalling for covert communications with an upper bound on $D(p_1\|p_0)$ or $D(p_0\|p_1)$ as the covertness constraint, where $p_0(y)$ and $p_1(y)$ are the likelihood functions of the observation $y$ at the warden under the null hypothesis (no covert transmission) and the alternative hypothesis (a covert transmission occurs), respectively; $D(p_1\|p_0)$ and $D(p_0\|p_1)$ differ due to the asymmetry of the Kullback-Leibler divergence. Considering additive white Gaussian noise at both the receiver and the warden, we prove that Gaussian signalling is optimal, in terms of maximizing the mutual information between the transmitted and received signals, for covert communications with an upper bound on $D(p_1\|p_0)$ as the constraint. More interestingly, we also prove that Gaussian signalling is not optimal with an upper bound on $D(p_0\|p_1)$ as the constraint: as we explicitly show, skew-normal signalling can then outperform Gaussian signalling by achieving higher mutual information. Finally, we prove that, for Gaussian signalling, an upper bound on $D(p_1\|p_0)$ is a tighter covertness constraint than the same upper bound on $D(p_0\|p_1)$, in the sense that it leads to lower mutual information, by showing that $D(p_0\|p_1) \le D(p_1\|p_0)$.
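As an illustrative sanity check of the last claim (a minimal sketch under assumed symbols, not the paper's full setup: transmit power $P$, warden noise variance $\sigma_w^2$, unit channel gain, so that $p_0 = \mathcal{N}(0,\sigma_w^2)$ and, under Gaussian signalling, $p_1 = \mathcal{N}(0,P+\sigma_w^2)$), the standard closed form $D\big(\mathcal{N}(0,a)\,\|\,\mathcal{N}(0,b)\big) = \tfrac{1}{2}\big[a/b - 1 + \ln(b/a)\big]$ gives, with $x \triangleq P/\sigma_w^2$,
\begin{align}
D(p_1\|p_0) &= \tfrac{1}{2}\left[\,x - \ln(1+x)\,\right], \\
D(p_0\|p_1) &= \tfrac{1}{2}\left[\,\ln(1+x) - \tfrac{x}{1+x}\,\right],
\end{align}
so that
\begin{equation}
D(p_1\|p_0) - D(p_0\|p_1) = \tfrac{1}{2}\left[\,x + \tfrac{x}{1+x} - 2\ln(1+x)\,\right] \ge 0,
\end{equation}
since the bracketed function vanishes at $x=0$ and has nonnegative derivative $\big(\tfrac{x}{1+x}\big)^2$.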