The detailed analysis of measured interferograms generally requires phase correction. Phase-shift correction methods are commonly used and well documented for conventional Fourier-transform spectroscopy. However, measured interferograms can show additional phase errors that depend on the optical path difference and signal frequency, which we call phase distortions. In spatial heterodyne spectroscopy they can be caused, for instance, by optical defects or image distortions, making them a characteristic of the individual spectrometer. They can generally be corrected without significant loss of signal-to-noise ratio. We present a technique to measure phase distortions by using a measured example interferogram. We also describe a technique to correct for phase distortions and test its performance by using a simulation with a near-UV solar spectrum. We find that for our measured example interferogram the phase distortion is small and nearly frequency independent. Furthermore, we show that the presented phase-correction technique is especially effective for apodized interferograms.
© 2004 Optical Society of America
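As a generic illustration of the phase-correction step the abstract refers to, the sketch below estimates the frequency-dependent phase from the Fourier transform of an interferogram and applies a Mertz-style multiplicative correction. This is an assumption for illustration only, not the authors' distortion-correction method; the function names and the use of NumPy are hypothetical.

```python
import numpy as np

def measure_phase(interferogram):
    """Estimate the frequency-dependent phase of an interferogram.

    Hypothetical sketch: in practice the phase is often derived from a
    low-resolution, double-sided portion of the interferogram rather
    than the full record.
    """
    spectrum = np.fft.rfft(interferogram)
    return np.angle(spectrum)

def phase_correct(interferogram):
    """Apply a Mertz-style multiplicative phase correction.

    Multiplying the complex spectrum by exp(-i*phase) rotates each
    spectral element onto the real axis, so the corrected spectrum
    equals the magnitude spectrum for this simple estimator.
    """
    spectrum = np.fft.rfft(interferogram)
    phase = np.angle(spectrum)
    return np.real(spectrum * np.exp(-1j * phase))
```

A frequency-dependent phase distortion, as discussed in the article, would replace the single global phase estimate with one measured per spectral element before the multiplicative correction is applied.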