An expression is derived for the apparent log-amplitude variance that would result experimentally from the use of a high-frequency cutoff νe in the electronic detection system. It is shown, for the Kolmogorov spectrum and in the saturation regime, that the apparent log-amplitude variance decreases asymptotically as
is the log-amplitude variance obtained from perturbation theory. Furthermore, it is shown that theories that suppress spatial frequencies of the order of 1/ρ0(L), where ρ0(L) is the lateral coherence length of the wave at propagation distance L, are equivalent to theories that suppress high temporal frequencies in the time domain. Thus, such theories predict a log-amplitude variance that decreases asymptotically with increasing values of
© 1974 Optical Society of America