We show theoretically that the incorporation of a frequency-dependent loss mechanism in a semiconductor laser can lead, in concert with the amplitude-to-phase coupling, to major reductions of the fundamental intensity and phase noise. A loss dispersion of the wrong sign, on the other hand, leads to an increase of the noise and, at a certain strength, to instability.

The optical field is taken as

$$E(t) = [E_0 + \delta(t)]\,e^{i[\omega_m t + \phi(t)]}, \tag{1}$$

where $E_0$ is the steady-state amplitude and $\delta(t)$ and $\phi(t)$ are the noise-driven amplitude and phase excursions, respectively. The loss is represented by a frequency-dependent photon lifetime $\tau_p$,

$$\frac{1}{\tau_p} = \frac{1}{\tau_{p0}} + \cdots, \tag{2}$$

with $\omega_m$ the average frequency and $\chi_i^{(3)}$ and $\chi_r^{(3)}$ the imaginary and real parts of the third-order amplifying-medium susceptibility, so that $\alpha$ is the amplitude-to-phase coupling factor. This is the key ansatz: it stipulates that the loss rate depends on the instantaneous frequency of the field.

A formal solution of Eqs. (3) with $\delta(0) = 0$, $\phi(0) = 0$ leads to expressions for $\delta(t)$ and $\phi(t)$ [Eqs. (5) and (6)], where $A_r$ and $A_i$ are, respectively, the real and imaginary parts of the Langevin noise sources representing the spontaneous emission, and

$$\omega' = \omega_1(1 + C\alpha), \tag{7}$$

where $\omega'$ is seen to play the role of the fundamental decay rate of the fluctuations. This rate increases for $C\alpha > 0$. Taking first the case of $\omega' > 0$, we can neglect the decaying terms, i.e., the terms containing $\omega'(t - x)$, in the exponent of Eq. (6).
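To make the role of $\omega'$ concrete, the following Python sketch treats the amplitude excursion $\delta(t)$ as a single linear Langevin variable with decay rate $\omega' = \omega_1(1 + C\alpha)$. This is a deliberate simplification of the coupled amplitude-phase equations (3); the function name, the diffusion coefficient, and all numerical values are illustrative assumptions rather than parameters from the text. Its steady-state variance scales as $1/\omega'$, so a loss dispersion giving $C\alpha > 0$ suppresses the fluctuations, while the wrong sign weakens the damping and, for $C\alpha < -1$, makes them grow, mirroring the instability noted above.

```python
import numpy as np

def amplitude_noise_variance(c_alpha, omega1=1.0, diffusion=1.0,
                             dt=1e-3, n_steps=200_000, seed=0):
    """Euler-Maruyama integration of a toy linearized Langevin equation
    d(delta)/dt = -omega_prime * delta + A_r(t), where the fluctuation
    decay rate is omega_prime = omega1 * (1 + c_alpha) as in Eq. (7).
    A_r(t) is white noise with diffusion coefficient `diffusion`
    (an illustrative value, not taken from the paper).
    Returns the sample variance of delta(t) over the run."""
    rng = np.random.default_rng(seed)
    omega_prime = omega1 * (1.0 + c_alpha)   # Eq. (7): omega' = omega_1 (1 + C*alpha)
    delta = 0.0                              # delta(0) = 0, as in the text
    samples = np.empty(n_steps)
    for k in range(n_steps):
        kick = rng.normal(0.0, np.sqrt(2.0 * diffusion * dt))
        delta += -omega_prime * delta * dt + kick
        samples[k] = delta
    return samples.var()

# A loss dispersion giving C*alpha > 0 raises omega' and shrinks the
# intensity-noise variance (~ diffusion / omega'); C*alpha < 0 lowers
# omega', and for C*alpha < -1 the fluctuations grow without bound.
for c_alpha in (0.0, 2.0, -0.5):
    var = amplitude_noise_variance(c_alpha)
    print(f"C*alpha = {c_alpha:+.1f}  ->  var(delta) ≈ {var:.2f}")
```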