Abstract

The chief emphasis of the paper is given to the calculation of the information capacity of a beam of light for on-off signal modulation. A preliminary and approximate calculation in Sec. 2 is based on Gaussian modulation. On-off modulation permits an exact calculation of the information capacity; the calculation is based on the Poisson statistics of nondegenerate ensembles of Bose entities.

Most of the results are given in terms of the information capacity per transmitted photon, measured in bits per photon, and called the information efficiency I_t. When there is no ambient light, and when the probability p that the gate is open is equal to one-half, the maximum possible information efficiency is just one bit per transmitted photon; when, however, the probability p of an open gate is less than one-half, the information efficiency I_t may be higher, with the upper bound log₂(1/p).
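
As an arithmetic illustration of this bound (using the two values of p that appear in the tables below; the numbers themselves are not quoted from the paper):

$\log_2(1/p) = 1\ \text{bit per photon for}\ p = \tfrac{1}{2}, \qquad \log_2(1/p) = \log_2 1000 \approx 9.97\ \text{bits per photon for}\ p = 0.001.$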

The calculation of the information efficiency in Sec. 3 supposes that no light reaches the receiver except that from the gated source. In Sec. 4 ambient light also is permitted to reach the receiver; extensive numerical results were obtained with a high-speed digital computer and are presented in tables and graphs.

The term light is used throughout the paper instead of radiation, in order to emphasize that the results hold only for nondegenerate beams of radiation. The output of laser oscillators is therefore excluded.

© 1962 Optical Society of America

Figures (6)

Fig. 1

A graphical representation of the probabilities of transmitting a “zero” or “one” (on the left) and of receiving a “zero” or “one” (on the right). The figure is for the special case in which no photons may reach the receiver, except those that are radiated by the source. One notes that when a “zero” is transmitted, a “zero” is always received, since when the gate is closed there are no photons at all. But when a “one” is transmitted, there is a conditional probability e^{-M} that no photons are received, where M is the mean number of photons received when a “one” is transmitted. This plot is of substantial assistance in writing the probabilities (3.1) and (3.2).

Fig. 2

All of the data in this plot are for symmetrical on-off modulation: p = 1/2. The upper plot shows the information efficiency I_t in bits per photon plotted vs the number of signal photons M. The curve A = 0 is based on Eq. (3.4) and the data in Table I. The short curves labeled A = 0.5, A = 1.0, A = 10, and A = 30 show the maximum value found in Table III for the given values of A and M. The open circles indicate the maximum of these curves. The lower plot shows that C approaches a maximum value of one bit per digit as M becomes infinite. The curves in this plot suggest the simple approximate formulas given by Eqs. (3.12) to (3.16).

Fig. 3

A graphical representation of the probabilities for the special case in which there is steady ambient radiation (mean number of photons per gate period = A) in addition to the signal radiation from the gated source. The Poisson sum S(A, R) is the conditional probability of receiving a “one” when a “zero” is transmitted, and the Poisson sum S(A + M, R) is the conditional probability of receiving a “one” when a “one” is transmitted; the complements of these sums are the conditional probabilities of receiving a “zero” when a “zero” or a “one” is transmitted. This plot is a help in writing down the probabilities (4.3) and (4.4). Unlike Fig. 1, in this plot the probability of receiving a “zero” (or a “one”) is represented in two separate places, and the total probability is the sum of the two separate contributions.

Fig. 4

A plot, versus the mean number A of ambient photons per gate period, of the absolute maximum value of the information efficiency I_t in bits per photon, for the given values of the probability p of transmitting a “one.” The curves are based on the data in Table II.

Fig. 5

The four plots are all for symmetrical on-off modulation: p = 1/2, and from left to right are for ambient photon numbers A equal to 0.5, 1.0, 10, and 30. Within each plot the information efficiency I_t in bits per photon is plotted versus the number of signal photons M for three different thresholds R of the receiver; the middle of the three values of R is the value that yields the absolute maximum of I_t for the given values of p and A. The data in the plots are tabulated in Table III. Note that the abscissa and the ordinate are given independently for each plot.

Fig. 6

The four plots are all for unsymmetrical on-off modulation, for which the probability of an open gate is p = 0.001 and the probability of a closed gate is therefore q = 0.999. The data in the plots are tabulated in Table V. In other respects the caption of Fig. 5 applies to this figure as well.

Tables (5)

Table I Information capacity C and the information efficiency I_t as a function of the mean number of signal photons M, for the special case of symmetrical on-off modulation, p = 1/2.

Table II Absolute maximum of the information efficiency I_t in bits per photon for the given values of the probability p of an open gate, and of the mean number A of ambient photons per gate period. Also shown are the values of the detector threshold R (smallest number of received photons required for a decision of “one”) and of the mean number M of signal photons that produce the absolute maximum of I_t. The values of Q/p and T are also shown.

Table III Information efficiency I_t in bits per transmitted photon. M is the mean number of signal photons per gate period when the gate is open, and R is the smallest number of received photons required for a “one” decision. All of the data are for p = 0.5, and for the indicated mean number A of ambient photons per gate period. For each value of A, the absolute maximum value of I_t is shown in italic type. The italicized values are also given in Table II and are plotted in Fig. 4. The data in this table are plotted in Fig. 5.

Table IV Same description as for Table III, except that p = 0.1, and the data are not plotted, except in Fig. 4.

Table V Same description as for Table III, except p = 0.001, and the data are plotted in Figs. 4 and 6.

Equations (33)

$\gamma = 1/(e^{h\nu/kT_s} - 1).$
$\lambda T_s \ll 14\,400\ \mu\text{-deg},$
$N \equiv \left[(M - \bar{M})^2\right]_{\mathrm{av}} = \bar{M}.$
$S = K^2 \bar{M}^2.$
$C = \log_2 (1 + S/N)^{1/2},$
$C = \tfrac{1}{2} \log_2 (1 + K^2 \bar{M}),$
$I = C/\bar{M} = (1/2\bar{M}) \log_2 (1 + K^2 \bar{M}),$
$I = \left[\tfrac{1}{2} \log_2 e\right] K^2 = 0.72\, K^2$
$N = \bar{M} + A,$
$I = (1/2\bar{M}) \log_2 \left[1 + K^2 \bar{M}^2/(\bar{M} + A)\right].$
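
The preliminary estimate above is easy to evaluate numerically. The following is a minimal Python sketch (the function name and the sample values of $\bar{M}$, K, and A are illustrative choices, not values taken from the paper):

```python
import math

def info_per_photon_gaussian(M_bar, K, A=0.0):
    """Approximate bits per transmitted photon from the preliminary estimate:
    I = (1 / (2*M_bar)) * log2(1 + K^2 * M_bar^2 / (M_bar + A)).
    M_bar: mean signal photons, K: modulation index, A: mean ambient photons."""
    return math.log2(1.0 + K**2 * M_bar**2 / (M_bar + A)) / (2.0 * M_bar)

# With K = 1, no ambient light, and very small M_bar, the estimate approaches
# the limiting value [0.5 * log2(e)] * K^2 = 0.72 bits per photon given above.
print(info_per_photon_gaussian(M_bar=0.01, K=1.0))         # about 0.72
print(info_per_photon_gaussian(M_bar=10.0, K=1.0, A=5.0))  # much smaller
```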
$P_0 = q, \qquad P_1 = p, \qquad P_0' = 1 - p(1 - e^{-M}), \qquad P_1' = p(1 - e^{-M}).$   (3.1)
$p_{00} = 1, \qquad p_{01} = 0, \qquad p_{10} = e^{-M}, \qquad p_{11} = 1 - e^{-M}.$   (3.2)
$C = \sum_i P_i \sum_j p_{ij} \log_2 p_{ij} - \sum_j P_j' \log_2 P_j'.$   (3.3)
(In Eqs. (3.1) and (3.3) the unprimed P's are the probabilities of the transmitted digits and the primed P's are the probabilities of the received digits, as in Fig. 1.)
$C = p_M \log_2 p^{-1} + (1 - p_M) \log_2 (1 - p_M)^{-1} - \mu\, p M e^{-M},$   (3.4)
$M \cong 1 - e^{-M},$   (3.5)
(In Eq. (3.4), $p_M = p(1 - e^{-M})$ is the probability of receiving a “one” and $\mu = \log_2 e$; with the small-M approximation (3.5) it reduces to Eq. (3.6).)
$C = pM \log_2 (1/p).$   (3.6)
$I_s = C/M = p \log_2 (1/p).$   (3.7)
$\max_p I_s = e^{-1} \log_2 e = 0.53\ \text{bit per photon}.$   (3.8)
$I_s = 0.5\ \text{bit per photon}$   (3.9)
$I_t = C/pM = \log_2 (1/p).$   (3.10)
$I_t = 1.0\ \text{bit per photon}.$   (3.11)
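
Eqs. (3.1)–(3.3) can be checked numerically. A minimal Python sketch (the helper names are mine; P1_rec and P0_rec stand for the primed quantities of Eq. (3.1)):

```python
import math

def F(x):
    """F[X] = X * log2(1/X), with F[0] taken to be 0."""
    return 0.0 if x <= 0.0 else -x * math.log2(x)

def capacity_no_ambient(p, M):
    """Bits per gate period for on-off modulation with no ambient light, built
    from the transition probabilities of Eq. (3.2):
    p00 = 1, p01 = 0, p10 = exp(-M), p11 = 1 - exp(-M)."""
    lam = 1.0 - math.exp(-M)   # probability that at least one photon arrives when the gate is open
    P1_rec = p * lam           # P1' of Eq. (3.1): probability of receiving a "one"
    P0_rec = 1.0 - P1_rec      # P0' of Eq. (3.1)
    return F(P0_rec) + F(P1_rec) - p * (F(math.exp(-M)) + F(lam))

p, M = 0.5, 0.01
C = capacity_no_ambient(p, M)
print(C / (p * M))  # I_t = C/pM; tends to log2(1/p) = 1 bit per photon as M -> 0
```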
$I_t \cong 1/(1 + M^2/4)^{1/2}, \qquad p = \tfrac{1}{2},$   (3.12)
$C \cong \tfrac{1}{2} M/(1 + M^2/4)^{1/2}, \qquad p = \tfrac{1}{2}.$   (3.13)
$I_t \cong \log_2 p^{-1}/(1 + M^2/M_c^2)^{1/2}$   (3.14)
$C \cong pM \log_2 p^{-1}/(1 + M^2/M_c^2)^{1/2},$   (3.15)
$M_c = 2 \ \text{for}\ p = \tfrac{1}{2}; \qquad M_c = 1 + (\log_2 e)/(\log_2 p^{-1}) \ \text{for}\ p^{-1} \gg 1; \qquad M_c = 1 \ \text{for}\ \log_2 p^{-1} \gg 1.$   (3.16)
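
As a spot check of approximation (3.12) (my own arithmetic, not a value quoted from the paper's tables): at p = 1/2 and M = 2 the exact expressions (3.1)–(3.3) give I_t ≈ 0.70 bit per photon, while (3.12) gives

$I_t \cong 1/(1 + 2^2/4)^{1/2} = 1/\sqrt{2} \approx 0.71.$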
$U(B, W) = e^{-B} B^W/W!$   (4.1)
$S(B, R) = \sum_{W=R}^{\infty} e^{-B} B^W/W!.$   (4.2)
$P_0 = 1 - p, \qquad P_1 = p, \qquad P_0' = p\{1 - S(A + M, R)\} + (1 - p)\{1 - S(A, R)\}, \qquad P_1' = p\, S(A + M, R) + (1 - p)\, S(A, R).$   (4.3)
$p_{00} = 1 - S(A, R), \qquad p_{01} = S(A, R), \qquad p_{10} = 1 - S(A + M, R), \qquad p_{11} = S(A + M, R).$   (4.4)
$C = F[pT + (1 - p)Q] + F[p(1 - T) + (1 - p)(1 - Q)] - p\left(F[T] + F[1 - T]\right) - (1 - p)\left(F[Q] + F[1 - Q]\right),$   (4.5)
$Q \equiv S(A, R), \qquad T \equiv S(A + M, R), \qquad F[X] \equiv X \log_2 (1/X).$   (4.6)
$I_t = C/pM.$   (4.7)
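
The kind of machine computation summarized in Tables II–V can be sketched directly from Eqs. (4.1)–(4.7). The Python below is an illustrative reconstruction (function names and the sample values of p, M, and A are mine, not taken from the tables):

```python
import math

def poisson_tail(B, R):
    """S(B, R) of Eq. (4.2): probability of R or more Poisson counts with mean B,
    computed as one minus the finite sum over W = 0 .. R-1."""
    if R <= 0:
        return 1.0
    term = math.exp(-B)            # U(B, 0) of Eq. (4.1)
    total = term
    for W in range(1, R):
        term *= B / W              # U(B, W) obtained from U(B, W-1)
        total += term
    return max(0.0, 1.0 - total)

def F(x):
    """F[X] = X * log2(1/X) of Eq. (4.6), with F[0] taken to be 0."""
    return 0.0 if x <= 0.0 else -x * math.log2(x)

def capacity(p, M, A, R):
    """Bits per gate period from Eq. (4.5), for gate probability p, mean signal
    photons M, mean ambient photons A, and receiver threshold R."""
    Q = poisson_tail(A, R)         # probability of a "one" decision when a "zero" is sent
    T = poisson_tail(A + M, R)     # probability of a "one" decision when a "one" is sent
    return (F(p * T + (1 - p) * Q) + F(p * (1 - T) + (1 - p) * (1 - Q))
            - p * (F(T) + F(1 - T)) - (1 - p) * (F(Q) + F(1 - Q)))

def best_threshold_efficiency(p, M, A, R_max=200):
    """Information efficiency I_t = C/pM of Eq. (4.7), maximized over the threshold R."""
    return max(capacity(p, M, A, R) / (p * M) for R in range(1, R_max + 1))

# Illustrative evaluation; scanning M as well as R, in the manner of Table III,
# locates the absolute maximum of I_t for given p and A.
print(best_threshold_efficiency(p=0.5, M=20.0, A=1.0))
```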