This work theoretically studies the transmission performance of a directly modulated laser (DML)-based OFDM system using a small-signal approximation, with a model that accounts for both transient and adiabatic chirp. The dispersion-induced distortion is modeled as subcarrier-to-subcarrier intermixing interference (SSII), and the theoretical SSII agrees with the distortion obtained from large-signal simulation both statistically and deterministically. The analysis shows that adiabatic chirp eases power fading, or even provides gain, but increases the SSII and thereby degrades OFDM signals after dispersive transmission. Furthermore, this work proposes a novel iterative equalization scheme to eliminate the SSII. Simulations show that the proposed equalization effectively mitigates the distortion, significantly extending the maximum transmission distance of DML-based OFDM signals; for instance, the reach of a 30-Gbps DML-based OFDM signal is extended from 10 km to more than 100 km. Moreover, because the equalization effectively removes the dispersion-induced distortion, negative power penalties are observed at some distances owing to chirp-induced power gain.
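The abstract does not detail the iterative equalization itself; as an illustration only, a minimal sketch of decision-directed iterative interference cancellation (a common way to realize such an equalizer) might look like the following. The `ssii_kernel` callable, the toy second-order mixing model used to exercise it, and all parameter values are assumptions for illustration, not the paper's actual SSII model.

```python
import numpy as np

def hard_decision(symbols, constellation):
    """Slice each received symbol to the nearest constellation point."""
    idx = np.argmin(np.abs(symbols[:, None] - constellation[None, :]), axis=1)
    return constellation[idx]

def iterative_ssii_equalizer(rx, ssii_kernel, constellation, n_iter=3):
    """Decision-directed iterative cancellation of SSII (illustrative sketch).

    rx          : received frequency-domain subcarrier symbols
    ssii_kernel : callable estimating the SSII produced by a symbol vector
                  (hypothetical stand-in for an analytical SSII model)
    """
    est = hard_decision(rx, constellation)             # initial decisions
    for _ in range(n_iter):
        ssii = ssii_kernel(est)                        # regenerate distortion estimate
        est = hard_decision(rx - ssii, constellation)  # cancel SSII, then re-decide
    return est
```

In each pass, the current symbol decisions regenerate an SSII estimate, which is subtracted from the received signal before re-slicing; when the initial decisions are mostly correct, the cancellation converges within a few iterations.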
©2012 Optical Society of America