Abstract: We conduct a thorough comparison of two basic notch filters employed to mitigate the pattern effect that manifests when semiconductor optical amplifiers (SOAs) are used for linear amplification. The filters are built on the optical delay interferometer (ODI) and the microring resonator (MRR) architectures. We formulate and follow a rational procedure, which involves identifying and applying the appropriate conditions on the filters' spectral response slope related to the SOA pattern effect suppression mechanism. We thus extract the values of the free spectral range and detuning of each filter, which allow the pursued comparison to be made unequivocally. We define suitable performance metrics and obtain simulation results for each filter. The quantitative comparison reveals that most of the employed metrics are better with the MRR than with the ODI. Although the difference in performance is small, it is sufficient to justify also considering the MRR for the intended purpose. Finally, we concisely discuss practical implementation issues of these notch filters and make a qualitative comparison between them in terms of their inherent advantages and disadvantages. This discussion reveals that each scheme has distinct features that render it appropriate for supporting SOA direct signal amplification applications with a suppressed pattern effect.