Laser interferometric gravitational-wave detectors use Fabry-Pérot cavities to increase their peak sensitivity. This, however, comes at the cost of a reduced detection bandwidth, which originates from the propagation phase delay of the light. The "white-light-cavity" idea, first proposed by Wicht et al. [Opt. Commun. 134, 431 (1997)], is to circumvent this limitation by introducing anomalous dispersion, using a double-pumped gain medium, to compensate for this phase delay. In this article, starting from the Hamiltonian of the atom-light interaction, we apply an input-output formalism to evaluate the quantum noise of the system. We find that, apart from the additional noise associated with the parametric amplification process noted by others, the stability condition for the entire system imposes a further constraint. By surveying the parameter regimes in which the gain medium remains stable (not lasing) and stationary, we find no net enhancement of the shot-noise-limited sensitivity. Other gain media or different parameter regimes should therefore be explored to realize the white-light cavity.
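
As a rough illustration of the phase-compensation condition mentioned above (a minimal sketch using notation not defined in the abstract: cavity length $L$, intracavity medium length $l_m$, refractive index $n(\omega)$, carrier frequency $\omega_0$, sideband frequency $\Omega$), the round-trip phase of a signal sideband is
\[
\phi(\Omega) \;=\; \frac{2(L-l_m)\,(\omega_0+\Omega)}{c} \;+\; \frac{2\,n(\omega_0+\Omega)\,(\omega_0+\Omega)\,l_m}{c}\,,
\]
and demanding that it be independent of the sideband frequency,
\[
\left.\frac{d\phi}{d\Omega}\right|_{\Omega=0}=0
\quad\Longrightarrow\quad
n_g(\omega_0)\equiv\left.\frac{d\,[\omega\,n(\omega)]}{d\omega}\right|_{\omega_0}
=-\,\frac{L-l_m}{l_m}\;\approx\;-\,\frac{L}{l_m}\,,
\]
calls for a large negative group index, i.e., strong anomalous dispersion with $dn/d\omega<0$, which is what the double-pumped gain medium is intended to supply.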