This article explores the power-sensitivity trade-off in optical receivers, aiming to improve the energy efficiency of the overall link. Optical receivers with field-effect transistor (FET) front-ends (FEs) are usually designed for optimal noise performance by matching the circuit's input capacitance (C_I) to the total input parasitic capacitance (C_D). However, the receiver's power dissipation is also proportional to the input capacitance C_I. Therefore, this article studies the suitability of the capacitive matching rule in the context of minimizing the power dissipation of the overall link. For that purpose, design trade-offs for the receiver, the transmitter, and the overall link are presented. Comparisons are made to study how far the receiver can be downsized, sacrificing optimal noise performance, before its power reduction is offset by the transmitter's increase in power. Simulation results show that energy-efficient links require low-power receivers with an input capacitance much smaller than that required for noise-optimum performance. As an example, for 25 Gb/s operation, an optical loss budget of 12.6 dB, and a receiver designed in 65 nm CMOS technology with C_D of 200 fF, the overall link dissipates 2.55 pJ/bit when the receiver's noise is minimized, leading to a receiver with C_I/C_D = 1.29. When optimized for overall link efficiency, the receiver size is significantly reduced to C_I/C_D = 0.38 and the link's energy efficiency improves to 1.41 pJ/bit. If the link budget or knowledge of the transmitter side is incomplete, our analysis indicates that maximizing the gain, which corresponds to C_I/C_D = 0.5, is a reasonable choice.

INDEX TERMS Laser driver, link budget, main amplifier, transimpedance amplifier, VCSEL.