Abstract
Recent demonstrations of lightwave systems operating at bit rates as high as 16 Gbit/s (Ref. 1) and over optical-amplifier-based spans hundreds of kilometers long (Ref. 2) have renewed interest in evaluating polarization-mode dispersion as a potential limitation in high-bit-rate systems. For such applications it is important to consider random polarization-mode coupling caused by internally or externally derived perturbations of the fiber birefringence. One consequence of such mode coupling is that the temporal response of a birefringent fiber can be altered by changes in ambient temperature, owing to the sensitivity of mode coupling to the relative phase of the modes at a given perturbation in the fiber (Ref. 3). We show that, because of this sensitivity to the environment, polarization-mode dispersion can lead to fading of the baseband signal in high-bit-rate lightwave systems that is in many ways analogous to multipath fading in radio systems. Such fading affects both coherent and direct-detection systems.
© 1990 Optical Society of America
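The multipath analogy can be illustrated with a minimal numerical sketch. Assuming a simple first-order PMD model, the signal splits between the two principal states of polarization with power fraction gamma and the two replicas arrive separated by the differential group delay, so the baseband channel behaves like a two-path channel. The function name and the parameter values below are illustrative assumptions, not quantities taken from the paper.

```python
import numpy as np

def pmd_baseband_response(freqs_hz, dgd_s, gamma):
    """Power transfer |H(f)|^2 of a two-path channel standing in for
    first-order PMD: H(f) = gamma + (1 - gamma) * exp(-j 2 pi f * DGD)."""
    h = gamma + (1.0 - gamma) * np.exp(-2j * np.pi * freqs_hz * dgd_s)
    return np.abs(h) ** 2

freqs = np.linspace(0.0, 20e9, 201)   # baseband frequencies up to 20 GHz
dgd = 50e-12                          # 50 ps differential group delay (assumed)

for gamma in (0.5, 0.8):              # equal vs. unequal power split
    power = pmd_baseband_response(freqs, dgd, gamma)
    notch = freqs[np.argmin(power)]
    print(f"gamma={gamma:.1f}: deepest fade |H|^2 = {power.min():.3f} "
          f"at {notch / 1e9:.1f} GHz")
```

With an equal power split (gamma = 0.5) the channel nulls completely at f = 1/(2*DGD), here 10 GHz, mirroring the deep fades of a two-ray radio multipath channel; as environmental changes redistribute power between the modes, the fade depth varies, which is the fading mechanism the abstract describes.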