G. Buchsbaum, A. Gottschalk, “Trichromacy, opponent colour coding and optimum information transmission in the retina,” Proc. R. Soc. London Ser. B 220, 89–113 (1983).

M. V. Srinivasan, S. B. Laughlin, A. Dubs, “Predictive coding: a fresh view of lateral inhibition,” Proc. R. Soc. London Ser. B 216, 427–459 (1982).

B. Sakitt, H. B. Barlow, “A model for the economical encoding of the visual image in cerebral cortex,” Biol. Cybern. 43, 97–108 (1982).

S. Laughlin, “A simple coding procedure enhances a neuron’s information capacity,” Z. Naturforsch. 36, 910–912 (1981).

A. N. Netravali, J. O. Limb, “Picture coding: a review,” Proc. IEEE 68, 336–406 (1980).

T. M. Cover, R. C. King, “A convergent gambling estimate of the entropy of English,” IEEE Trans. Inf. Theory IT-24, 413–421 (1978).

J. G. Simon, “Redondance et complexité,” Psychol. Belg. 12, 255–263 (1972).

J. V. Gaven, J. Tavitian, A. Harabedian, “The informative value of sampled images as a function of the number of gray levels used in encoding the images,” Photogr. Sci. Eng. 14, 16–20 (1970).

N. S. Tzannes, R. V. Spencer, A. J. Kaplan, “On estimating the entropy of random fields,” Inf. Control 16, 1–6 (1970).

J. R. Parks, “Prediction and entropy of half-tone pictures,” Behav. Sci. 10, 436–445 (1965).

W. F. Schreiber, “The measurement of third order probability distributions of television signals,” IRE Trans. Inf. Theory IT-2, 94–105 (1956). Schreiber measured third-order statistics, based on a raster scan, for a 6-bit/pixel image; they gave a third-order entropy of 1.5 bits/pixel, or a redundancy of 75%. He described this as his least complicated picture. The average third-order redundancy for the 4-bit/pixel images measured here was 44%. These results are difficult to compare directly because they depend on both the picture complexity and the quantization. However, estimates of entropy as a function of the degree of quantization typically show a decrease in redundancy as the number of levels increases (Ref. 22).
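
The redundancy figures quoted in this note follow from defining redundancy as one minus the ratio of entropy to nominal bits per pixel; a quick check:

```python
def redundancy(entropy_bits_per_pixel, bits_per_pixel):
    """Fraction of the nominal channel capacity not used by the source."""
    return 1.0 - entropy_bits_per_pixel / bits_per_pixel

# Schreiber's 6-bit image with a third-order entropy of 1.5 bits/pixel:
print(redundancy(1.5, 6))   # 0.75, i.e. 75%
```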

F. Attneave, “Informational aspects of visual perception,” Psychol. Rev. 61, 183–193 (1954).

C. E. Shannon, “Prediction and entropy of printed English,” Bell Syst. Tech. J. 30, 50–64 (1951).

H. C. Andrews, Introduction to Mathematical Techniques in Pattern Recognition (Krieger, Malabar, Fla., 1983).

H. B. Barlow, “Sensory mechanisms, the reduction of redundancy, and intelligence,” in Proceedings of the National Physical Laboratory Symposium on the Mechanization of Thought Processes (H. M. Stationery Office, London, 1959), No. 10, pp. 535–539.

H. B. Barlow, “Perception: what quantitative laws govern the acquisition of knowledge from the senses?” in Functions of the Brain, Wolfson College Lectures, C. Cohen, ed. (Clarendon, Oxford, 1985), pp. 11–43.

G. W. Cottrell, P. Munro, D. Zipser, “Image compression by back propagation: an example of extensional programming” (University of California, San Diego, La Jolla, Calif., 1987).

R. C. Gonzalez, P. Wintz, Digital Image Processing (Addison-Wesley, Reading, Mass., 1977).

B. B. Mandelbrot, Fractals: Form, Chance and Dimension (Freeman, San Francisco, 1977).

W. K. Pratt, Digital Image Processing (Wiley, New York, 1978).

W. A. Richards, “A lightness scale from image intensity distributions,” Artificial Intelligence Memo 648 (Massachusetts Institute of Technology, Cambridge, Mass., 1981).

E. Saund, “Dimensionality-reduction using connectionist networks,” Artificial Intelligence Memo 941 (Massachusetts Institute of Technology, Cambridge, Mass., 1987).

The entropy for the pseudofractal Gaussian image was calculated by using a discrete form of Shannon’s formula for the entropy of a continuous Gaussian source. [See C. E. Shannon, “A mathematical theory of communication,” Bell Syst. Tech. J. 27, 379–423 (1948).]
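
A sketch of such a calculation (with illustrative parameters, not the paper's): quantize a unit-variance Gaussian into bins of width Δ, sum −p log₂ p over the bins, and compare with the continuous-source formula ½ log₂(2πeσ²) − log₂ Δ, which the discrete sum approaches for fine quantization.

```python
import math

def discrete_gaussian_entropy(sigma=1.0, delta=0.01, span=8.0):
    """Entropy in bits of a Gaussian source quantized into bins of width delta
    (rectangle-rule mass per bin, summed over +/- span standard deviations)."""
    h, x = 0.0, -span * sigma
    while x <= span * sigma:
        p = delta * math.exp(-x * x / (2 * sigma**2)) / (sigma * math.sqrt(2 * math.pi))
        if p > 0.0:
            h -= p * math.log2(p)
        x += delta
    return h

# Shannon's continuous formula plus the quantization term:
expected = 0.5 * math.log2(2 * math.pi * math.e) - math.log2(0.01)   # ~8.69 bits
print(discrete_gaussian_entropy(), expected)
```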

W. S. Geisler, D. B. Hamilton, “Sampling-theory analysis of spatial vision,” J. Opt. Soc. Am. A 3, 62–70 (1986).

A. B. Watson, “Efficiency of a model human image code,” J. Opt. Soc. Am. A 4, 2401–2417 (1987).

L. Sirovich, M. Kirby, “Low-dimensional procedure for the characterization of human faces,” J. Opt. Soc. Am. A 4, 519–524 (1987).

The experiment was run on an IBM XT with a Number Nine Revolution graphics board, a Microsoft Mouse, and a Tektronix GMA 301 display. The programs were written in Lattice C with the HALO subroutine library. Pictures were digitized with an 8-bit/pixel Chorus Data Systems digitizer and an RCA camera.

B. B. Mandelbrot, Fractals: Form, Chance and Dimension (Freeman, San Francisco, 1977).

If the gray levels are uniformly distributed, then the histogram for the required number of correct guesses for each gray level is uniformly distributed as well.
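
This can be checked by simulation (a sketch; the 16 levels and the particular fixed guessing order are assumptions for illustration):

```python
import random
from collections import Counter

random.seed(0)
levels = 16
order = list(range(levels))
random.shuffle(order)                                  # arbitrary but fixed guessing order
guesses_needed = {g: i + 1 for i, g in enumerate(order)}

# With uniformly distributed gray levels, each guess count 1..16 occurs with
# probability 1/16, so the histogram of required guesses is uniform too.
counts = Counter(guesses_needed[random.randrange(levels)] for _ in range(160000))
```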

Relative entropy is Fn/m. The median nearest-neighbor predictor gave upper bounds of relative entropy, which increased by an average of 0.56 for each 1-bit/pixel increase in quantization from 2 to 6 bits/pixel, for leaves, four elders, and hbb. The relative entropy bound leveled off between 6 and 7 bits/pixel, probably because of the degradation in the original imaging process.
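
A minimal sketch of how a median nearest-neighbor predictor yields such an upper bound (my reconstruction, using three causal neighbors rather than necessarily the paper's neighborhood): since H(X | neighbors) ≤ H(X − prediction), the entropy of the prediction-residual histogram, divided by m, bounds the relative entropy.

```python
import math
from collections import Counter
from statistics import median

def relative_entropy_bound(img, m):
    """Upper bound on relative entropy (entropy per pixel / m bits) from the
    residuals of a median predictor over causal neighbors (left, above, above-left)."""
    residuals = Counter()
    for r in range(1, len(img)):
        for c in range(1, len(img[0])):
            pred = median([img[r][c - 1], img[r - 1][c], img[r - 1][c - 1]])
            residuals[img[r][c] - pred] += 1
    n = sum(residuals.values())
    h = -sum(v / n * math.log2(v / n) for v in residuals.values())
    return h / m

# A constant image is perfectly predictable; a 4-bit checkerboard needs ~1 bit/pixel.
flat = [[7] * 8 for _ in range(8)]
board = [[(r + c) % 2 * 15 for c in range(8)] for r in range(8)]
print(relative_entropy_bound(flat, 4), relative_entropy_bound(board, 4))
```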

An eye’s required capacity can be estimated roughly as follows. The photon flux at 555 nm, corresponding to 1 Td, is about 1.25 × 10^6 photons sec^−1 deg^−2. If we assume a receptor diameter of about 0.008 deg, then 10 nits for 33 msec with a 3-mm pupil corresponds to about 150 photons. We assume an effective discretization of levels according to Poisson statistics, which separates levels by 1 standard deviation. There are then about 17 levels, or 4 bits, going from 150 to 15 photons. A 1 deg × 1 deg area divided by the receptor area is 125 × 125 pixels. The actual required capacity would be somewhat less than 125 × 125 × 4 bits, because of a nonrectangular modulation transfer function and nonuniform receptor spacing.
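
The arithmetic in this note can be checked with a short script (all constants are the note's; the troland conversion, luminance times pupil area, is standard):

```python
import math

photons_per_td = 1.25e6            # photons s^-1 deg^-2 per troland at 555 nm
pupil_mm = 3.0
luminance_nits = 10.0              # cd m^-2
trolands = luminance_nits * math.pi * (pupil_mm / 2) ** 2   # ~70.7 Td
receptor_deg = 0.008               # receptor diameter, degrees
duration_s = 0.033

# Photons caught by one receptor in one integration time: ~187 before losses,
# consistent with the note's "about 150".
photons = photons_per_td * trolands * receptor_deg ** 2 * duration_s

# Poisson discretization: levels one standard deviation apart between 15 and
# 150 photons; integrating dN / sqrt(N) gives 2 * (sqrt(150) - sqrt(15)).
levels = 2 * (math.sqrt(150) - math.sqrt(15))   # ~17 levels, i.e. ~4 bits

pixels = (1.0 / receptor_deg) ** 2              # 125 x 125 receptors per square degree
capacity_bits = pixels * 4                      # upper bound: 125 * 125 * 4 bits
```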

If i, j, and k represent the gray levels of three consecutive pixels, the third-order conditional entropy F(k|i, j) can be estimated from the relative frequencies of the sequences ijk and ij. If these frequencies are denoted p(i, j, k) and p(i, j), respectively, then the third-order conditional entropy is the difference between the third-order and second-order joint entropies:

−Σ_{i,j,k} p(i, j, k) log₂ p(i, j, k) + Σ_{i,j} p(i, j) log₂ p(i, j).

For comparison with nearest-neighbor predictors, it would be interesting to know the eighth-order entropies, based on the eight nearest neighbors. However, this would require gathering data on histograms with 16^8 cells (eight neighbors with 16 gray levels each). This is prohibitive because of the number of samples, and hence the number of pictures, that would need to be measured and because of the computational demands.
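
As a sketch (not the paper's code), the estimate above can be computed from a raster-scanned sequence of gray levels:

```python
import math
from collections import Counter

def third_order_conditional_entropy(pixels):
    """Estimate F(k|i,j) as the difference between the third-order and
    second-order joint entropies of consecutive gray levels."""
    def joint_entropy(counts):
        total = sum(counts.values())
        return -sum(c / total * math.log2(c / total) for c in counts.values())

    triples = Counter(zip(pixels, pixels[1:], pixels[2:]))
    pairs = Counter(zip(pixels, pixels[1:]))
    return joint_entropy(triples) - joint_entropy(pairs)

# A strictly alternating sequence is fully determined by its two predecessors:
print(third_order_conditional_entropy([0, 1] * 500))   # ~0 bits
```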
