Optical pulse replicators generate multiple replicas of an optical waveform that are averaged to increase the signal-to-noise ratio of single-shot, high-bandwidth temporal measurements. Processing a replicated waveform requires that the delayed realizations of the waveform under test be properly retimed before averaging, since delay miscalibration significantly reduces the measurement bandwidth. Processing algorithms based on edge alignment, centroid matching, and minimization of the distance between replicas mitigate this bandwidth reduction; a global distance minimization, which simultaneously accounts for the distances between all pairs of retimed replicas, performs best, even in the presence of significant measurement noise. The general impact of chromatic dispersion on the averaged waveform is derived in the framework of the temporal transport-of-intensity equation, and the measurement error is quantified for various optical signals.
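The retime-then-average processing described above can be sketched numerically. The toy example below aligns noisy, delayed copies of a pulse to a reference replica by locating the cross-correlation peak, then averages them; this pairwise alignment to a single reference is a simplified stand-in for the global pairwise distance minimization discussed in the abstract, and all signal parameters (pulse shape, delays, noise level) are illustrative assumptions, not values from the paper.

```python
import numpy as np

def retime_and_average(replicas, ref=0):
    """Align each replica to a reference replica via its cross-correlation
    peak, then average. Illustrative stand-in for the paper's global
    distance-minimization retiming (pairwise alignment to one reference)."""
    n = len(replicas[ref])
    aligned = []
    for r in replicas:
        # Integer-sample lag that maximizes correlation with the reference.
        corr = np.correlate(r, replicas[ref], mode="full")
        shift = int(np.argmax(corr)) - (n - 1)
        # Undo the estimated delay (circular shift; pulse is far from edges).
        aligned.append(np.roll(r, -shift))
    return np.mean(aligned, axis=0)

# Toy demo: a Gaussian pulse and miscalibrated, noisy delayed replicas.
t = np.arange(256)
pulse = np.exp(-0.5 * ((t - 128) / 8.0) ** 2)
rng = np.random.default_rng(0)
replicas = [np.roll(pulse, s) + 0.05 * rng.standard_normal(t.size)
            for s in (0, 3, -2, 5)]
avg = retime_and_average(replicas)
```

Averaging the four retimed replicas reduces the additive noise variance by roughly the number of replicas, which is the SNR benefit that motivates pulse replication in the first place.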
© 2013