Abstract

Ultra-wide-band analog-to-digital (A/D) conversion is one of the most critical problems faced in communication, instrumentation, and radar systems. This paper presents a comprehensive analysis of the recently proposed time-stretched A/D converter. By reducing the signal bandwidth prior to digitization, this technique offers revolutionary enhancements in the performance of electronic converters. The paper starts with a fundamental-physics analysis of the time-wavelength transformation and the implication of time dilation on the signal-to-noise ratio. A detailed mathematical description of the time-stretch process is then constructed. It elucidates the influence of linear and nonlinear optical dispersion on the fidelity of the electrical signal. Design issues of a single-sideband time-stretch system, as they relate to broad-band operation, are examined. Problems arising from the nonuniform optical power spectral density are explained, and two methods for overcoming them are described. As proof of the concept, 120-GSa/s real-time digitization of a 20-GHz signal is demonstrated. Finally, design issues and performance features of a continuous-time time-stretch system are discussed.
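The core idea can be illustrated numerically: stretching a signal in time by a factor M compresses its bandwidth by M, so an electronic digitizer running at rate f_adc effectively samples the original signal at M × f_adc. The sketch below is a minimal numerical illustration of this principle only, not the paper's optical implementation; the stretch factor M = 4 and digitizer rate of 30 GSa/s are assumed values chosen so that the effective rate matches the 120-GSa/s demonstration.

```python
import numpy as np

M = 4.0        # assumed stretch factor (hypothetical, for illustration)
f_sig = 20e9   # 20-GHz input tone, as in the demonstrated digitization
f_adc = 30e9   # assumed electronic digitizer rate: 4 x 30 GSa/s = 120 GSa/s

# Sample instants at the (slow) electronic ADC.
t_adc = np.arange(0, 2e-9, 1.0 / f_adc)

# After time stretching, the waveform seen by the ADC oscillates at f_sig / M.
samples = np.sin(2 * np.pi * (f_sig / M) * t_adc)

# Mapping the ADC time axis back to the original (unstretched) time axis
# shows these are samples of the original 20-GHz signal taken at M * f_adc.
t_eff = t_adc / M
f_eff = M * f_adc
original = np.sin(2 * np.pi * f_sig * t_eff)

print(f"effective sample rate: {f_eff / 1e9:.0f} GSa/s")
print("stretched samples match original signal:", np.allclose(samples, original))
```

Running this prints an effective rate of 120 GSa/s and confirms that the slow-rate samples of the stretched waveform coincide with high-rate samples of the original signal, which is the bandwidth-reduction mechanism the abstract describes.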

© 2003 IEEE
