Abstract

One aspect of human image understanding is the ability to estimate missing parts of a natural image. This ability depends on the redundancy of the representation used to describe the class of images. In 1951, Shannon [Bell Syst. Tech. J. 30, 50 (1951)] showed how to estimate bounds on the entropy and redundancy of an information source from predictability data. The entropy, in turn, gives a measure of the limits to error-free information compaction. An experiment was devised in which human observers interactively restored missing gray levels from 128 × 128 pixel pictures with 16 gray levels. For eight images, the redundancy ranged from 46%, for a complicated picture of foliage, to 74%, for a picture of a face. For almost-complete pictures, but not for noisy pictures, this performance can be matched by a nearest-neighbor predictor.

© 1987 Optical Society of America


References


  1. F. Attneave, “Informational aspects of visual perception,” Psych. Rev. 61, 183–193 (1954).
  2. H. B. Barlow, “Sensory mechanisms, the reduction of redundancy, and intelligence,” in Proceedings of the National Physical Laboratory Symposium on the Mechanization of Thought Processes (H. M. Stationery Office, London, 1959), No. 10, pp. 535–539.
  3. H. B. Barlow, “Perception: what quantitative laws govern the acquisition of knowledge from the senses?” in Functions of the Brain, Wolfson College Lectures, C. Cohen, ed. (Clarendon, Oxford, 1985), pp. 11–43.
  4. A. N. Netravali, J. O. Limb, “Picture coding: a review,” Proc. IEEE 68, 336–406 (1980).
  5. G. Buchsbaum, A. Gottschalk, “Trichromacy, opponent colour coding and optimum information transmission in the retina,” Proc. R. Soc. London Ser. B 220, 89–113 (1983).
  6. S. Laughlin, “A simple coding procedure enhances a neuron’s information capacity,” Z. Naturforsch. 36, 910–912 (1981).
  7. M. V. Srinivasan, S. B. Laughlin, A. Dubs, “Predictive coding: a fresh view of lateral inhibition,” Proc. R. Soc. London Ser. B 216, 427–459 (1982).
  8. W. A. Richards, “A lightness scale from image intensity distributions,” Artificial Intelligence Memo 648 (Massachusetts Institute of Technology, Cambridge, Mass., 1981).
  9. L. Sirovich, M. Kirby, “Low-dimensional procedure for the characterization of human faces,” J. Opt. Soc. Am. A 4, 519–524 (1987).
  10. E. Saund, “Dimensionality-reduction using connectionist networks,” Artificial Intelligence Memo 941 (Massachusetts Institute of Technology, Cambridge, Mass., 1987).
  11. G. W. Cottrell, P. Munro, D. Zipser, “Image compression by back propagation: an example of extensional programming,” (University of California, San Diego, La Jolla, Calif., 1987).
  12. C. E. Shannon, “Prediction and entropy of printed English,” Bell Syst. Tech. J. 30, 50–64 (1951).
  13. T. M. Cover, R. C. King, “A convergent gambling estimate of the entropy of English,” IEEE Trans. Inf. Theory IT-24, 413–421 (1978).
  14. J. R. Parks, “Prediction and entropy of half-tone pictures,” Behav. Sci. 10, 436–445 (1965).
  15. N. S. Tzannes, R. V. Spencer, A. J. Kaplan, “On estimating the entropy of random fields,” Inf. Control 16, 1–6 (1970).
  16. J. G. Simon, “Redondance et complexité,” Psychol. Belg. 12, 255–263 (1972).
  17. The experiment was run on an IBM PC/XT with a Number Nine Revolution graphics board, a Microsoft Mouse, and a Tektronix GMA 301 display. The programs were written in Lattice C with the HALO subroutine library. Pictures were digitized with an 8-bit/pixel Chorus Data Systems digitizer and an RCA camera.
  18. B. B. Mandelbrot, Fractals: Form, Chance and Dimension (Freeman, San Francisco, 1977).
  19. J. V. Gaven, J. Tavitian, A. Harabedian, “The informative value of sampled images as a function of the number of gray levels used in encoding the images,” Photogr. Sci. Eng. 14, 16–20 (1970).
  20. If the gray levels are uniformly distributed, then the histogram of the number of guesses required to reach the correct gray level is uniformly distributed as well. (A short sketch demonstrating this follows the reference list.)
  21. The entropy for the pseudofractal Gaussian image was calculated by using a discrete form of Shannon’s formula for the entropy of a continuous Gaussian source. [See C. E. Shannon, “A mathematical theory of communication,” Bell Syst. Tech. J. 27, 371–428 (1948).] (A sketch of one such discrete form follows the reference list.)
  22. Relative entropy is F_n/m. The median nearest-neighbor predictor gave upper bounds on relative entropy, which increased by an average of 0.56 for each 1-bit/pixel increase in quantization from 2 to 6 bits/pixel, for leaves, four elders, and hbb. The relative entropy bound leveled off between 6 and 7 bits/pixel, probably because of degradation in the original imaging process. (A sketch of this predictor’s first-guess rule follows the reference list.)
  23. An eye’s required capacity can be estimated roughly as follows. The photon flux at 555 nm, corresponding to 1 Td, is about 1.25 × 10⁶ photons sec⁻¹ deg⁻². If we assume a receptor diameter of about 0.008 deg, then 10 nits for 33 msec with a 3-mm pupil corresponds to about 150 photons. We assume an effective discretization of levels according to Poisson statistics, which separates levels by 1 standard deviation. There are then about 17 levels, or 4 bits, going from 150 to 15 photons. A 1 deg × 1 deg area divided by the receptor area gives 125 × 125 pixels. The actual required capacity would be somewhat less than 125 × 125 × 4 bits, because of a nonrectangular modulation transfer function and nonuniform receptor spacing. (The arithmetic is worked through in a sketch after the reference list.)
  24. If i, j, and k represent the gray levels of three consecutive pixels, the third-order conditional entropy, F(k | i, j), can be estimated from the relative frequencies of the sequences ijk and ij. If these frequencies are denoted p(i, j, k) and p(i, j), respectively, then the third-order conditional entropy is the difference between the third-order and second-order joint entropies: $-\sum_{i,j,k} p(i,j,k)\,\log_2 p(i,j,k) + \sum_{i,j} p(i,j)\,\log_2 p(i,j)$. For comparison with nearest-neighbor predictors, it would be interesting to know the eighth-order entropies, based on the eight nearest neighbors. However, this would require gathering data on histograms with $16^8$ bins, which is prohibitive both because of the number of samples, and hence of pictures, that would have to be measured and because of the computational demands. (A computational sketch of the third-order estimate follows the reference list.)
  25. W. F. Schreiber, “The measurement of third order probability distributions of television signals,” IRE Trans. Inf. Theory IT-2, 94–105 (1956). Schreiber measured third-order statistics, based on a raster scan, for a 6-bit/pixel image and obtained a third-order entropy of 1.5 bits/pixel, or a redundancy of 75%. He described this as his least complicated picture. The average third-order redundancy for the 4-bit/pixel images measured here was 44%. These results are difficult to compare directly because they depend on both picture complexity and quantization; however, estimates of entropy as a function of the degree of quantization typically show a decrease in redundancy as the number of levels increases (see Ref. 22).
  26. W. K. Pratt, Digital Image Processing (Wiley, New York, 1978).
  27. H. C. Andrews, Introduction to Mathematical Techniques in Pattern Recognition (Krieger, Malabar, Fla., 1983).
  28. R. C. Gonzalez, P. Wintz, Digital Image Processing (Addison-Wesley, Reading, Mass., 1977).
  29. B. Sakitt, H. B. Barlow, “A model for the economical encoding of the visual image in cerebral cortex,” Biol. Cybern. 43, 97–108 (1982).
  30. W. S. Geisler, D. B. Hamilton, “Sampling-theory analysis of spatial vision,” J. Opt. Soc. Am. A 3, 62–70 (1986).
  31. A. B. Watson, “Efficiency of a model human image code,” J. Opt. Soc. Am. A 4, 2401–2417 (1987).
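
The claim in note 20 can be checked with a short simulation. This is a hypothetical sketch, assuming independent, uniformly distributed gray levels and a guesser who always tries the levels in the same fixed order:

```python
# Hypothetical demonstration of note 20: with independent, uniformly
# distributed gray levels and a fixed guessing order, the number of guesses
# needed to reach the correct level is itself uniform on 1..16.
import random
from collections import Counter

random.seed(0)
levels = 16
counts = Counter()
for _ in range(160_000):
    true_level = random.randrange(levels)   # uniformly distributed gray level
    guesses_needed = true_level + 1         # guesser tries 0, 1, 2, ... in order
    counts[guesses_needed] += 1

# Every bin is close to 160_000 / 16 = 10_000, i.e. the histogram is flat.
print(sorted(counts.items()))
```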
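
Note 21 does not spell out the discrete form of Shannon's Gaussian-entropy formula that was used; the sketch below assumes one standard approximation, in which the log of the quantization step is subtracted from the differential entropy:

```python
# Assumed form (the paper's exact discretization is not given here): entropy
# of a quantized Gaussian source ~ (1/2) * log2(2*pi*e*sigma^2) - log2(step),
# valid when the quantization step is small compared with sigma.
import math

def quantized_gaussian_entropy(sigma, step):
    """Approximate entropy, in bits/sample, of a Gaussian source with
    standard deviation `sigma` quantized with bin width `step`."""
    differential = 0.5 * math.log2(2 * math.pi * math.e * sigma ** 2)
    return differential - math.log2(step)

# Example: 16 levels spanning +/- 2 sigma, so step = 4 * sigma / 16.
print(quantized_gaussian_entropy(sigma=1.0, step=4.0 / 16))  # ~4.0 bits
```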
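
Note 22 and Fig. 3 describe the median nearest-neighbor predictor as taking the second-brightest of the four nearest neighbors as its first guess; its later guesses are not specified in this excerpt. The sketch below is therefore an interpretation of the first-guess rule only, not the authors' code:

```python
# Interpretation of the median nearest-neighbor predictor (first guess only):
# the second-brightest gray level among the four nearest known neighbors.
def median_neighbor_first_guess(above, below, left, right):
    neighbors = sorted([above, below, left, right], reverse=True)
    return neighbors[1]          # second brightest of the four

print(median_neighbor_first_guess(3, 7, 5, 6))  # -> 6
```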
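
The arithmetic of note 23 can be worked through explicitly. The intermediate steps below (converting luminance and pupil area to trolands, and counting Poisson-separated levels as 2(√150 - √15)) are my reading of the note rather than quotations from it:

```python
# Rough re-derivation of the capacity estimate in note 23.  Intermediate
# steps are assumptions; only the end figures are quoted in the note.
import math

flux_per_td = 1.25e6              # photons / (s * deg^2) at 555 nm per troland
luminance = 10.0                  # nits (cd/m^2)
pupil_diameter_mm = 3.0
duration_s = 0.033
receptor_diameter_deg = 0.008

pupil_area_mm2 = math.pi * (pupil_diameter_mm / 2) ** 2    # ~7.1 mm^2
trolands = luminance * pupil_area_mm2                      # ~71 Td
receptor_area_deg2 = receptor_diameter_deg ** 2
photons = flux_per_td * trolands * duration_s * receptor_area_deg2
print(round(photons))   # roughly the ~150 photons quoted (depends on assumptions)

# Levels separated by one Poisson standard deviation between ~150 and ~15
# photons: integral of dn / sqrt(n) = 2 * (sqrt(150) - sqrt(15)).
levels = 2 * (math.sqrt(150) - math.sqrt(15))
bits = math.log2(levels)
print(round(levels), round(bits))  # ~17 levels, ~4 bits

pixels_per_side = round(1.0 / receptor_diameter_deg)       # 125
print(pixels_per_side ** 2 * 4)    # ~62,500 bits for a 1 deg x 1 deg patch
```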
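
The estimator in note 24 follows directly from the joint histograms. The sketch below computes F(k | i, j) = H(i, j, k) − H(i, j) along a one-dimensional scan; the toy input is purely illustrative:

```python
# Sketch of the third-order conditional entropy estimate of note 24:
# F(k | i, j) = H3 - H2, from relative frequencies of consecutive triples
# and pairs of gray levels along a scan.
import math
from collections import Counter

def third_order_conditional_entropy(pixels):
    triples = Counter(zip(pixels, pixels[1:], pixels[2:]))
    pairs = Counter(zip(pixels, pixels[1:]))
    n3, n2 = sum(triples.values()), sum(pairs.values())
    h3 = -sum(c / n3 * math.log2(c / n3) for c in triples.values())
    h2 = -sum(c / n2 * math.log2(c / n2) for c in pairs.values())
    return h3 - h2

# A periodic toy scan is almost perfectly predictable, so the conditional
# entropy is ~0 (small edge effects aside).
print(third_order_conditional_entropy([0, 1, 2, 3] * 6))
```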

Figures (3)

Fig. 1

Picture hbb quantized to 128 × 128 × 4 bits. (a), (b), and (c) have increasing fractions of deleted pixels. About 1% and 100% of the pixels have been deleted from (a) and (c), respectively.

Fig. 2

The percentage of trials for which the observer got the right answer, as a function of the percentage of pixels deleted and the number of tries (1 to 16), for observer DJK and the picture of the man’s face (hbb).

Fig. 3

Upper and lower entropy bounds, in bits per pixel, for the picture of the man’s face, as a function of the number of undeleted (that is, known) pixels, for observer DJK (triangles). Upper entropy bounds are also shown (squares) for a median predictor that chooses the second-brightest gray level of the four nearest neighbors as its first guess (see the text for more details).

Tables (1)


Table 1 Upper Entropy Bounds Estimated from Gray-Level Predictability

Equations (5)


$$F_n = -\sum_{i,j} p(i, b_j)\,\log_2 p(i \mid b_j),$$
$$1 - \frac{F_n}{m}.$$
$$\text{upper bound} = -\sum_{i=1}^{2^m} q_i^N \log_2 q_i^N,$$
$$\text{lower bound} = \sum_{i=1}^{2^m} i \left( q_i^N - q_{i+1}^N \right) \log_2 i.$$
$$-\sum_{i,j,k} p(i,j,k)\,\log_2 p(i,j,k) + \sum_{i,j} p(i,j)\,\log_2 p(i,j).$$
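
A minimal sketch (not the authors' code) of how the upper- and lower-bound expressions above can be evaluated from guessing data, where q_i is read as the relative frequency with which the correct gray level was found on the i-th guess; the histogram used here is hypothetical:

```python
# Sketch of the Shannon guessing-game bounds above.  guess_counts[i] is the
# number of trials on which exactly i+1 guesses were needed to name the
# missing gray level; with 16 gray levels, m = 4 bits/pixel.
import math

def entropy_bounds(guess_counts):
    n = sum(guess_counts)
    q = [c / n for c in guess_counts]          # q_i, i = 1 .. 2^m
    upper = -sum(qi * math.log2(qi) for qi in q if qi > 0)
    q_next = q[1:] + [0.0]                     # q_{i+1}, with q_{2^m + 1} = 0
    lower = sum((i + 1) * (qi - qn) * math.log2(i + 1)
                for i, (qi, qn) in enumerate(zip(q, q_next)))
    return lower, upper

# Hypothetical histogram: most pixels are guessed on the first try.
counts = [600, 150, 80, 50, 30, 20, 15, 12, 10, 8, 7, 6, 5, 4, 2, 1]
low, high = entropy_bounds(counts)
print(f"{low:.2f} <= F <= {high:.2f} bits/pixel, redundancy >= {1 - high / 4:.0%}")
```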
