The field-emission phenomenon is exploited in a broad variety of applications and systems. Previous studies have reported that the current induced by field emission depends strongly and inherently on temperature. This dependence enhances the noise in the current, which degrades performance in, for example, signal detection and communication in nanoscale receivers. In this paper, a mathematical model is presented for the suppression of this noise based on its probability density. Our experiments and analysis revealed that the density follows a Gaussian distribution and that its dependence on temperature is exponential. This result is intriguing because, in the fields of signal processing and communication, the influence of temperature is often considered with a noise-temperature model, namely, a linear dependence. Using the derived model, we theoretically evaluated the communication performance of a nanoscale receiver; owing to the exponential dependence on temperature, severe performance degradation was found with increasing temperature. This means that, as field-emission technology continues to be developed, the operating temperature should be kept low, for example, at room temperature, to secure the reliability of nanoscale communication devices.

INDEX TERMS Bit-error rate, field emission, nanoscale communication, nonlinear temperature dependence.
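To make the contrast between the two temperature models concrete, the sketch below compares the bit-error rate of a simple binary receiver in zero-mean Gaussian noise when the noise variance grows linearly with temperature (the conventional noise-temperature picture) versus exponentially with temperature (the dependence reported here). All numerical constants (signal amplitude, `k_lin`, `A`, `alpha`) are purely illustrative assumptions and are not taken from the paper.

```python
# Illustrative sketch only: every constant below is hypothetical and chosen
# to make the qualitative trend visible, not fitted to the paper's data.
from math import erfc, exp, sqrt

def ber_gaussian(signal_amp, noise_var):
    """BER of binary antipodal signaling in zero-mean Gaussian noise:
    Q(A / sigma) = 0.5 * erfc(A / (sigma * sqrt(2)))."""
    return 0.5 * erfc(signal_amp / (sqrt(noise_var) * sqrt(2)))

def var_linear(temp_k, k_lin=1e-3):
    """Conventional noise-temperature model: variance proportional to T."""
    return k_lin * temp_k

def var_exponential(temp_k, a=1e-6, alpha=0.02):
    """Exponential temperature dependence of the noise variance
    (qualitative form of the dependence reported for field-emission noise)."""
    return a * exp(alpha * temp_k)

signal_amp = 1.0
for temp_k in (300, 500, 700, 900):  # temperature in kelvin
    print(f"T = {temp_k} K: "
          f"BER(linear)      = {ber_gaussian(signal_amp, var_linear(temp_k)):.3e}, "
          f"BER(exponential) = {ber_gaussian(signal_amp, var_exponential(temp_k)):.3e}")
```

With these assumed constants the exponential variance is negligible near room temperature but overtakes the linear model at a few hundred kelvin above it, after which the bit-error rate collapses rapidly; the crossover point itself depends entirely on the chosen constants, and the sketch is meant only to illustrate the qualitative mechanism behind the degradation described in the abstract.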