This paper investigates maximum-likelihood sequence estimation (MLSE) receivers operating on intensity-modulated direct-detection optical channels. The study focuses on long-haul and metro links spanning several hundred kilometers of single-mode fiber with optical amplifiers. The structure of MLSE-based optical receivers operating in the presence of dispersion and amplified spontaneous emission (ASE), as well as shot and thermal noise, is discussed, and a theory of the error rate of these receivers is developed. Computer simulations show close agreement between the theoretical predictions and simulation results. Some important implementation issues are also addressed.

Optical channels suffer from impairments that set them apart from other channels and therefore require special investigation. Among these impairments are the nonlinearity of the channel and noise that is often non-Gaussian and signal dependent. For example, in optically amplified single-mode fiber links, the dominant source of noise is ASE, which after photodetection is distributed according to a noncentral chi-square probability density function. In addition, optical fibers suffer from chromatic dispersion and polarization-mode dispersion (PMD). Although the use of MLSE in optical channels has been discussed in previous literature, no detailed analysis of optical receivers using this technique has been reported so far. This motivates the study reported in this paper.
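The signal-dependent, non-Gaussian character of photodetected ASE noise mentioned above can be sketched with a small Monte-Carlo check (illustrative only; the degrees of freedom `k` and noncentrality `lam` below are hypothetical choices, not values from the paper). A noncentral chi-square variable is the sum of k squared unit-variance Gaussians whose squared means sum to lam, and its mean, k + lam, grows with the signal level:

```python
import random

def ncx2_sample(k, lam, rng):
    """Draw one noncentral chi-square sample with k degrees of freedom
    and noncentrality lam, built as the sum of k squared unit-variance
    Gaussians, one of which has mean sqrt(lam)."""
    total = rng.gauss(lam ** 0.5, 1.0) ** 2
    for _ in range(k - 1):
        total += rng.gauss(0.0, 1.0) ** 2
    return total

rng = random.Random(0)
k, lam, n = 4, 9.0, 20000  # hypothetical parameters for illustration
samples = [ncx2_sample(k, lam, rng) for _ in range(n)]
mean = sum(samples) / n
# The theoretical mean is k + lam, so the sample mean should be near 13;
# because lam scales with the optical signal power, the noise statistics
# depend on the transmitted symbol, unlike additive Gaussian noise.
print(round(mean, 1))
```

This signal dependence is why Gaussian-metric detectors are mismatched on amplified links, and why an MLSE branch metric built from the correct conditional densities can be expected to do better.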
© 2005 IEEE