We derive a model that optimizes the performance of a laser satellite communication link with an optical preamplifier in the presence of random jitter in the transmitter–receiver line of sight. The system uses a transceiver containing a single telescope with a circulator. The telescope serves for both transmitting and receiving and thus reduces communication terminal dimensions and weight. The optimization model was derived under the assumption that the dominant noise source is amplifier spontaneous-emission noise. It is shown that, given the required bit-error rate (BER) and the rms random pointing jitter, an optimal transceiver gain exists that minimizes the transmitted power. We investigate the effect of the amplifier spontaneous-emission noise on the optimal transmitted power and gain by performing the optimization procedure for various combinations of amplifier gain and noise figure. We demonstrate that the amplifier noise figure determines the optimal transmitted power needed to achieve the desired BER but does not affect the optimal transceiver telescope gain. Our numerical example shows that, for a BER of 10⁻⁹, doubling the amplifier noise figure results in an 80% increase in the minimal transmitted power for an rms pointing jitter of 0.44 μrad.
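The existence of an optimal transceiver gain can be illustrated with a simplified textbook link model, not the paper's full derivation: higher telescope gain narrows the far-field beam, so a pointing error θ costs roughly a factor exp(−Gθ²) in received power, and the required transmit power 1/(G·exp(−Gθ²)) is minimized at a finite G. The fade-margin multiplier `K_FADE` and the normalization `P_RX_REQ` below are hypothetical placeholders; only the 0.44 μrad jitter value comes from the abstract.

```python
import math

SIGMA = 0.44e-6     # rms pointing jitter [rad], from the abstract
K_FADE = 3.0        # assumed fade-margin multiplier (hypothetical)
P_RX_REQ = 1.0      # required received power, arbitrary units

def required_tx_power(gain, sigma=SIGMA, k=K_FADE):
    """Transmit power needed to close the link at a k-sigma pointing fade."""
    pointing_loss = math.exp(-gain * (k * sigma) ** 2)
    return P_RX_REQ / (gain * pointing_loss)

# Scan the gain over a broad logarithmic grid and pick the minimum.
gains = [10 ** (e / 100) for e in range(1000, 1400)]   # 1e10 .. ~1e14
best_gain = min(gains, key=required_tx_power)

# Analytic optimum of this toy model: dP/dG = 0 at G = 1/(k*sigma)^2.
analytic = 1.0 / (K_FADE * SIGMA) ** 2
print(f"numeric optimum gain  : {best_gain:.3e}")
print(f"analytic optimum gain : {analytic:.3e}")
```

In this toy model the optimal gain depends only on the jitter statistics, not on `P_RX_REQ`; scaling the required received power (e.g., for a higher amplifier noise figure) rescales the minimal transmit power but leaves the optimum gain unchanged, consistent with the abstract's conclusion.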
© 2004 Optical Society of America