The effects of randomly varying birefringence on solitons are studied. It is shown analytically that the evolution equation can be reduced to the nonlinear Schrödinger equation if the variation length is much shorter than the soliton period. The soliton does not split at high values of the average birefringence, but it does undergo spreading and loss of polarization. A soliton with a temporally constant initial state of polarization is still largely polarized after 40 soliton periods (40z₀) if the normalized birefringence is δ ≤ 1.3.
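The reduction claimed above means that, on length scales long compared with the birefringence variation, the pulse obeys the scalar nonlinear Schrödinger equation, whose fundamental soliton propagates without spreading. The following is a minimal sketch of that limiting behavior using a standard split-step Fourier integration of the normalized NLSE, i u_z + ½ u_tt + |u|² u = 0; it is an illustration of the limit equation only, not of the paper's coupled equations or its random-birefringence model, and all grid parameters are illustrative choices.

```python
import numpy as np

# Split-step Fourier integration of the normalized NLSE
#   i u_z + (1/2) u_tt + |u|^2 u = 0
# The fundamental soliton u(0, t) = sech(t) should keep its
# amplitude profile |u| under propagation.
N = 1024                      # number of time-grid points (illustrative)
T = 40.0                      # time-window width (illustrative)
t = np.linspace(-T / 2, T / 2, N, endpoint=False)
w = 2 * np.pi * np.fft.fftfreq(N, d=T / N)   # angular frequencies

u = 1.0 / np.cosh(t)          # fundamental soliton initial condition
dz = 0.01                     # step size in z
steps = 500                   # total distance z = 5 (a few soliton periods)

# Half-step linear (dispersion) propagator: u_hat *= exp(-i w^2 dz / 4)
half_disp = np.exp(-0.25j * w**2 * dz)
for _ in range(steps):
    u = np.fft.ifft(half_disp * np.fft.fft(u))   # half dispersion step
    u = u * np.exp(1j * np.abs(u)**2 * dz)       # full nonlinear step
    u = np.fft.ifft(half_disp * np.fft.fft(u))   # half dispersion step

# Deviation of |u| from the initial sech profile (should be small)
err = np.max(np.abs(np.abs(u) - 1.0 / np.cosh(t)))
```

With these parameters the amplitude profile is preserved to well below one percent, consistent with soliton propagation under the limit equation; spreading would appear only when the full coupled equations with birefringence are retained.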
© 1991 Optical Society of America