Abstract

A model is presented for analyzing the nonlinear interaction between signal and noise that is mediated by the Kerr effect in optical communication systems. The model treats signal and noise separately and permits analysis of the evolution of each symbol's central time position and frequency. It is shown that this nonlinear signal-noise interaction leads to random frequency shifts of the symbols, which in turn induce timing jitter in all types of systems. We also discuss the problem of estimating the timing jitter of a signal embedded in noise.
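As an illustrative sketch of the mechanism, not drawn from the paper itself: in a dispersive link, a symbol's center-frequency offset is converted into an arrival-time shift by the group-velocity dispersion, so the timing offset at the end of a link of length L can be written schematically as

\[
\delta t(L) = \int_{0}^{L} \beta_2(z)\,\Omega(z)\,\mathrm{d}z,
\]

where \(\beta_2(z)\) is the group-velocity dispersion and \(\Omega(z)\) is the symbol's (random) center-frequency offset at position z; this notation is ours, not the authors'. Zero-mean random frequency shifts of this kind therefore produce a timing jitter whose variance grows with the accumulated dispersion, in the spirit of the Gordon-Haus effect.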

© 2008 IEEE
