The performance of lasercom systems operating in the atmosphere is degraded by optical turbulence, which causes irradiance fluctuations in the received signal and hence random fading. Fade statistics obtained from experimental data were compared with theoretical predictions based on the lognormal and gamma–gamma distributions. The probability of fade, the expected number of fades per second, and the mean fade time were calculated from the irradiance fluctuations of a Gaussian beam wave propagating through the atmosphere along a horizontal path, near the ground, in the moderate-to-strong optical turbulence regime. Irradiance data were collected simultaneously at three receiving apertures of different sizes. Atmospheric propagation parameters were inferred from the measurements and used in the calculations for the theoretical distributions. Fade predictions made by the gamma–gamma and lognormal distributions were found to provide an upper and a lower bound, respectively, on the probability of fade and the number of fades per second for the irradiance data collected in the moderate-to-strong fluctuation regime. What is believed to be a new integral expression for the expected number of fades based on the gamma–gamma distribution was developed. This new expression tracked the gamma–gamma distributed data more closely than the existing approximation and yielded a higher number of fades.
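To make the probability-of-fade quantity concrete, the sketch below evaluates the standard lognormal fade probability under the unit-mean-irradiance convention of Andrews and Phillips: P(I ≤ I_T) = ½{1 + erf[(ln(I_T) + σ²/2)/(σ√2)]}, with the threshold expressed as a fade level F_T in dB below the mean. This is a minimal illustration of the lognormal model discussed above, not the authors' code; the function name and parameter choices are assumptions for the example.

```python
import math

def lognormal_fade_probability(fade_threshold_db, sigma2_ln_i):
    """Probability that irradiance fades F_T dB (or more) below its mean,
    for a lognormal irradiance model with log-irradiance variance
    sigma2_ln_i and unit mean irradiance (illustrative helper, not from
    the paper).
    """
    # Threshold irradiance relative to the mean: I_T = <I> * 10**(-F_T/10),
    # so ln(I_T/<I>) = -(ln 10 / 10) * F_T.
    ln_ratio = -(math.log(10.0) / 10.0) * fade_threshold_db
    # Unit-mean lognormal: ln I ~ Normal(-sigma2/2, sigma2).
    z = (ln_ratio + 0.5 * sigma2_ln_i) / math.sqrt(2.0 * sigma2_ln_i)
    return 0.5 * (1.0 + math.erf(z))

# Example: deeper fade thresholds are exceeded less often.
p3 = lognormal_fade_probability(3.0, 0.5)  # 3 dB fade
p6 = lognormal_fade_probability(6.0, 0.5)  # 6 dB fade
```

As expected, the fade probability decreases monotonically as the fade threshold deepens; the gamma–gamma counterpart requires numerical integration of its PDF and is omitted here.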
© 2007 Optical Society of America