In a soliton transmission system, spontaneous-emission noise from the optical amplifiers produces timing jitter that is usually assumed to be Gaussian distributed. It is shown that the mutual interaction of solitons in neighboring time slots can produce non-Gaussian tails in the jitter distribution and thereby a substantial increase in the bit-error rate. It is argued that the approach used here will also be useful in the study of non-return-to-zero systems.
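The qualitative point of the abstract, that heavy tails in the timing-jitter distribution dominate the bit-error rate even when the variance is unchanged, can be illustrated with a simple Monte Carlo sketch. This is a hypothetical illustration, not the paper's method: the window half-width `t_w`, the jitter scale `sigma`, and the use of a rescaled Student's t distribution as a stand-in for interaction-induced non-Gaussian tails are all assumptions made here for demonstration.

```python
import numpy as np
from math import erfc, sqrt

rng = np.random.default_rng(0)

# Assumed error model: a bit error occurs when a soliton drifts outside
# an acceptance window of half-width t_w (here 4 sigma).
sigma = 1.0          # jitter standard deviation (arbitrary units)
t_w = 4.0            # acceptance-window half-width
n = 2_000_000        # Monte Carlo samples

# Analytic Gaussian error probability: P(|t| > t_w) = erfc(t_w / (sqrt(2) sigma))
p_gauss = erfc(t_w / (sqrt(2.0) * sigma))

# Monte Carlo with Gaussian jitter
t_g = rng.normal(0.0, sigma, n)
p_mc_gauss = np.mean(np.abs(t_g) > t_w)

# Monte Carlo with a heavier-tailed distribution of the SAME variance:
# Student's t with nu = 5, rescaled to unit variance. This is only a
# stand-in for the non-Gaussian tails caused by soliton interaction.
nu = 5
t_t = rng.standard_t(nu, n) * sqrt((nu - 2) / nu) * sigma
p_mc_tail = np.mean(np.abs(t_t) > t_w)

print(f"Gaussian (analytic):    {p_gauss:.2e}")
print(f"Gaussian (Monte Carlo): {p_mc_gauss:.2e}")
print(f"Heavy-tailed (MC):      {p_mc_tail:.2e}")
```

At equal variance the heavy-tailed jitter yields an error rate one to two orders of magnitude above the Gaussian prediction, which is why a Gaussian assumption can badly underestimate the bit-error rate.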
© 1995 Optical Society of America