Abstract

In this paper, we present an experimental and numerical study of semiconductor optical amplifier (SOA)-based noise suppression and its relevance to high-channel-density spectrum-sliced wavelength-division-multiplexed systems. We show that the improvement in signal quality is accompanied by spectral distortion, which leaves the signal susceptible to degradation under subsequent optical filtering. This phenomenon originates from the loss of intensity correlation between the spectral components of the SOA output when the signal spectrum is altered. This introduces a design tradeoff between intensity noise and crosstalk in high-channel-density systems. These adverse effects can be overcome by an optimized SOA design, yielding a significant improvement in signal quality.

© 2005 IEEE
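
To make the mechanism concrete, the following is a minimal toy-model sketch, not the authors' numerical model: thermal-like light is represented by independent, exponentially distributed spectral slices, and the saturated SOA is idealized as a perfect limiter that clamps the total power. Clamping anticorrelates the slices rather than quieting each one, so an optical filter that discards part of the spectrum breaks the correlation and restores intensity noise. All parameters (32 slices, a filter keeping one quarter of the band) are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

N_SLICES = 32        # spectral components within the SOA bandwidth (assumed)
N_SAMPLES = 100_000  # time samples

# Thermal (ASE-like) light: each spectral slice has an independent,
# exponentially distributed instantaneous intensity.
I = rng.exponential(scale=1.0, size=(N_SAMPLES, N_SLICES))

def rin(x):
    """Relative intensity noise proxy: variance over squared mean."""
    return x.var() / x.mean() ** 2

# Unsuppressed input: total intensity fluctuates with RIN ~ 1/N_SLICES.
print("input RIN:          ", rin(I.sum(axis=1)))

# Idealized saturated SOA as a perfect limiter: it clamps the *total*
# power, which anticorrelates the slices instead of smoothing each one.
I_soa = I / I.sum(axis=1, keepdims=True)
print("SOA output RIN:     ", rin(I_soa.sum(axis=1)))   # ~0

# A subsequent optical filter keeps only part of the spectrum; the
# discarded slices carried the compensating fluctuations, so intensity
# noise reappears in the filtered signal.
I_filtered = I_soa[:, : N_SLICES // 4].sum(axis=1)
print("filtered output RIN:", rin(I_filtered))

In this toy model the SOA output shows essentially zero residual intensity noise, while the filtered output recovers most of the noise of an unsuppressed slice of the same width, mirroring the tradeoff between intensity noise and crosstalk described in the abstract.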
