Visible light communication (VLC), which uses the intensity modulation of light-emitting diodes (LEDs), provides a new communication medium that alleviates the shortage of radio spectrum and allows reuse of existing LED lighting infrastructure. Orthogonal frequency-division multiplexing (OFDM) has been introduced to VLC for its ability to mitigate the fading caused by delay spread and to avoid low-frequency ambient-light interference. Noise and clipping are two major factors that degrade the performance of OFDM in VLC: a larger signal more easily overcomes noise, but suffers greater impairment from clipping, since the LED can emit only within a limited, non-negative intensity range. Degradation due to clipping therefore trades off against degradation due to noise, depending on the OFDM signal amplitude. In this paper, the optimal signal amplitude in this trade-off is obtained by simulation for a given dimming level and LED intensity; the former reflects the user's lighting requirement, while the latter represents the channel quality. The required LED intensity-to-noise ratio, that is, the channel quality that guarantees the dimming target as well as an adequate bit-error rate (BER), is also discussed.
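As a rough illustration of this trade-off (a minimal sketch, not the paper's own simulation setup), the following Python snippet models the OFDM time-domain signal as Gaussian (by the central limit theorem), adds a DC bias, clips it to a normalized LED intensity range, adds receiver noise, and reports an effective signal-to-noise-and-distortion ratio (SNDR) as the signal amplitude is swept. The constants I_MAX, DC_BIAS, and NOISE_STD are illustrative assumptions; the 50% bias corresponds to one hypothetical dimming level.

```python
import numpy as np

rng = np.random.default_rng(0)

I_MAX = 1.0            # assumed normalized LED intensity range [0, I_MAX]
DC_BIAS = 0.5 * I_MAX  # mid-range bias, i.e. an assumed 50% dimming level
NOISE_STD = 0.05       # assumed receiver noise std (sets the channel quality)
N_SAMPLES = 200_000    # OFDM time-domain samples, approximated as Gaussian

def effective_sndr(sigma):
    """SNDR for a given OFDM signal standard deviation `sigma`.

    The combined impairment is taken as the difference between the
    received signal and the ideal (unclipped, noiseless) biased signal,
    lumping clipping distortion and noise together. This is a simplified
    metric; a rigorous treatment would use the Bussgang decomposition.
    """
    x = rng.normal(0.0, sigma, N_SAMPLES)             # bipolar OFDM samples
    tx = np.clip(x + DC_BIAS, 0.0, I_MAX)             # LED clips to [0, I_MAX]
    rx = tx + rng.normal(0.0, NOISE_STD, N_SAMPLES)   # AWGN at the receiver
    impairment = rx - (x + DC_BIAS)                   # clipping error + noise
    return 10 * np.log10(np.var(x) / np.var(impairment))

# Sweep the amplitude: small sigma is dominated by noise, large sigma by clipping.
for sigma in [0.05, 0.10, 0.15, 0.20, 0.30, 0.50]:
    print(f"sigma={sigma:.2f}  SNDR={effective_sndr(sigma):6.2f} dB")
```

Under these assumed parameters, the SNDR rises with sigma while noise dominates and then falls once clipping dominates, peaking at an intermediate amplitude, which qualitatively reproduces the trade-off whose optimum the paper determines by simulation.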