Abstract

An iterative method of restoring degraded images was developed by treating images, point spread functions, and degraded images as probability-frequency functions and by applying Bayes’s theorem. The method functions effectively in the presence of noise and is adaptable to computer operation.

© 1972 Optical Society of America






Figures (7)

Fig. 1. Convergence of restoration toward true values of undegraded image.

Fig. 2. Convergence of correction-factor values (C) toward unity.

Fig. 3. Restoration with no noise. (A) Original image, (B) degraded image, (C) 10-iteration restoration, (D) 20 iterations, and (E) 30 iterations.

Fig. 4. Restoration with 0.1 noise. (A) Original image, (B) degraded image, (C) 10-iteration restoration, (D) 20 iterations, and (E) 30 iterations.

Fig. 5. Restoration with 0.1 noise by the Fourier-transform method with estimated least-squares filter.

Fig. 6. Restoration of originally uniform image (all values of W_{i,j} = 1.0). One value on diagonal of degraded image doubled. (A) H′(0,0) = 2H(0,0), corner: maximum W = 1.380, minimum W = 0.850; (B) H′(1,1) = 2H(1,1): maximum W = 1.474, minimum W = 0.807; (C) H′(2,2) = 2H(2,2): maximum W = 1.494, minimum W = 0.819; (D) H′(3,3) = 2H(3,3), center: maximum W = 1.320, minimum W = 0.863.

Fig. 7. Restoration of originally uniform image (all values of W_{i,j} = 1.0). One value on an axis of degraded image doubled. (A) H′(3,0) = 2H(3,0), middle of side: maximum W = 1.348, minimum W = 0.837; (B) H′(3,1) = 2H(3,1): maximum W = 1.307, minimum W = 0.876; (C) H′(3,2) = 2H(3,2): maximum W = 1.315, minimum W = 0.882; (D) H′(3,3) = 2H(3,3), center: maximum W = 1.320, minimum W = 0.863.

Equations (14)


$$P(W_i \mid H_k) = \frac{P(H_k \mid W_i)\,P(W_i)}{\sum_j P(H_k \mid W_j)\,P(W_j)}; \qquad i=\{1,I\},\; j=\{1,J\},\; k=\{1,K\}, \tag{1}$$

$$P(W_i) = \sum_k P(W_i, H_k) = \sum_k P(W_i \mid H_k)\,P(H_k), \tag{2}$$

$$P(W_i) = \sum_k \frac{P(H_k \mid W_i)\,P(W_i)\,P(H_k)}{\sum_j P(H_k \mid W_j)\,P(W_j)}. \tag{3}$$

$$P_{r+1}(W_i) = P_r(W_i) \sum_k \frac{P(H_k \mid W_i)\,P(H_k)}{\sum_j P(H_k \mid W_j)\,P_r(W_j)}; \qquad r = \{0, 1, \ldots\}. \tag{4}$$
$$S = \sum_j S_j, \qquad j = \{1, J\}. \tag{5}$$

$$\frac{W_{i,r+1}}{W} = \frac{W_{i,r}}{W} \sum_k \frac{(S_{i,k}/S)\,(H_k/W)}{\sum_j (S_{j,k}/S)\,(W_{j,r}/W)}, \tag{6}$$

$$W_{i,r+1} = W_{i,r} \sum_k \frac{S_{i,k}\,H_k}{\sum_j S_{j,k}\,W_{j,r}}. \tag{7}$$

$$W_{i,r+1} = W_{i,r} \sum_{k=i}^{c} \frac{S_{k-i+1}\,H_k}{\sum_{j=a}^{b} S_{k-j+1}\,W_{j,r}}, \tag{8}$$

$$W_{i,1} = \sum_{k=i}^{c} \frac{S_{k-i+1}\,H_k}{\sum_{j=a}^{b} S_{k-j+1}}. \tag{9}$$
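A minimal sketch of the one-dimensional iterative update above in pure Python: each estimate is multiplied by a correction factor built from the ratio of the degraded data H to the re-blurred current estimate, correlated with the spread function. The test signal, PSF, and uniform starting estimate here are illustrative assumptions, not values from the paper.

```python
# One-dimensional iterative restoration in the multiplicative form above.
# Hypothetical example data -- not taken from the paper.

def blur(w, s):
    """Full (linear) convolution of signal w with spread function s."""
    h = [0.0] * (len(w) + len(s) - 1)
    for i, wi in enumerate(w):
        for j, sj in enumerate(s):
            h[i + j] += wi * sj
    return h

def restore_1d(h, s, iterations):
    """Iteratively estimate w from degraded h and known, unit-sum s."""
    n = len(h) - len(s) + 1
    w = [sum(h) / n] * n                     # uniform starting estimate
    for _ in range(iterations):
        b = blur(w, s)                       # re-blur current estimate
        ratio = [hk / bk for hk, bk in zip(h, b)]
        # correction factor: correlate the ratio with s (sums to 1 at a fixed point)
        w = [w[i] * sum(s[j] * ratio[i + j] for j in range(len(s)))
             for i in range(n)]
    return w

w_true = [1.0, 4.0, 2.0]                     # assumed undegraded signal
s = [0.25, 0.5, 0.25]                        # assumed normalized PSF
h = blur(w_true, s)                          # exact, noiseless degraded data
w_est = restore_1d(h, s, 1000)               # approaches w_true
```

Because the PSF sums to one, each iteration conserves the total flux of the estimate, and with noiseless data the estimate converges toward the true signal.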
$$W_{i,j,r+1} = W_{i,j,r} \sum_{m=i}^{e} \sum_{n=j}^{f} \frac{H_{m,n}\,S_{m-i+1,\,n-j+1}}{\sum_{p=a}^{b} \sum_{q=c}^{d} W_{p,q,r}\,S_{m-p+1,\,n-q+1}}, \tag{10}$$

$$W_{i,j,1} = \sum_{m=i}^{e} \sum_{n=j}^{f} \frac{H_{m,n}\,S_{m-i+1,\,n-j+1}}{\sum_{p=a}^{b} \sum_{q=c}^{d} S_{m-p+1,\,n-q+1}}, \tag{11}$$
$$H = \begin{pmatrix} 2 & 10 & 12 \\ 16 & 60 & 52 \\ 30 & 82 & 56 \end{pmatrix}; \qquad S = \begin{bmatrix} 2 & 4 \\ 6 & 8 \end{bmatrix}. \tag{12}$$

$$W = \begin{bmatrix} 20 & 60 \\ 100 & 140 \end{bmatrix}. \tag{13}$$
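As a check on the worked example, convolving W with S normalized to unit sum reproduces H. Dividing S by its total (2 + 4 + 6 + 8 = 20) is an assumption here, consistent with the paper's treatment of the spread function as a probability-frequency function.

```python
# Verify the worked example: blurring the original image W with the
# point spread function S (normalized to unit sum, an assumption)
# yields the degraded image H.

W = [[20, 60], [100, 140]]
S = [[2, 4], [6, 8]]
H = [[2, 10, 12], [16, 60, 52], [30, 82, 56]]

total = sum(sum(row) for row in S)           # 20
s = [[v / total for v in row] for row in S]  # normalized PSF

# Full two-dimensional convolution of W with the normalized PSF.
rows, cols = len(W) + len(S) - 1, len(W[0]) + len(S[0]) - 1
B = [[0.0] * cols for _ in range(rows)]
for p in range(len(W)):
    for q in range(len(W[0])):
        for m in range(len(S)):
            for n in range(len(S[0])):
                B[p + m][q + n] += W[p][q] * s[m][n]

# B matches H up to floating-point rounding.
```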
$$H' = H\,(1 + r\,d). \tag{14}$$
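The pieces above can be combined into a small end-to-end sketch: iterate the two-dimensional multiplicative correction on the worked example, then repeat on data perturbed by the multiplicative noise model H′ = H(1 + rd). Treating r as uniform on [−1, 1] with d = 0.1, and normalizing S by its sum, are assumptions made for this sketch.

```python
import random

# Two-dimensional iterative restoration on the worked example, then a
# run with multiplicative noise H' = H(1 + r*d).  The noise-parameter
# choices (r uniform on [-1, 1], d = 0.1) are assumptions.

def convolve2d(w, s):
    """Full 2-D convolution of image w with (normalized) PSF s."""
    rows, cols = len(w) + len(s) - 1, len(w[0]) + len(s[0]) - 1
    b = [[0.0] * cols for _ in range(rows)]
    for p in range(len(w)):
        for q in range(len(w[0])):
            for m in range(len(s)):
                for n in range(len(s[0])):
                    b[p + m][q + n] += w[p][q] * s[m][n]
    return b

def restore_2d(H, s, shape, iterations):
    """Multiplicative iteration: W <- W * correlate(H / blur(W), s)."""
    total = sum(map(sum, H))
    w = [[total / (shape[0] * shape[1])] * shape[1] for _ in range(shape[0])]
    for _ in range(iterations):
        b = convolve2d(w, s)
        ratio = [[H[m][n] / b[m][n] for n in range(len(H[0]))]
                 for m in range(len(H))]
        w = [[w[i][j] * sum(s[m][n] * ratio[i + m][j + n]
                            for m in range(len(s))
                            for n in range(len(s[0])))
              for j in range(shape[1])]
             for i in range(shape[0])]
    return w

s = [[0.1, 0.2], [0.3, 0.4]]                 # S / 20, assumed normalization
H = [[2, 10, 12], [16, 60, 52], [30, 82, 56]]
w_est = restore_2d(H, s, (2, 2), 500)        # approaches [[20, 60], [100, 140]]

random.seed(0)
d = 0.1
H_noisy = [[h * (1 + random.uniform(-1, 1) * d) for h in row] for row in H]
w_noisy = restore_2d(H_noisy, s, (2, 2), 50) # fewer iterations under noise
```

Stopping earlier in the noisy run reflects the behavior shown in the figures: with noise present, later iterations begin to amplify the noise rather than improve the estimate.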