Abstract

A method is proposed for deconvolving the true image of an object from two recorded images. The two images must be acquired by an imaging system with two different but interrelated kernels. The method is formulated as a system of Fredholm equations of the first kind that reduces to a single functional equation in Fourier space. Both the kernels of the system and the true image of the object are recovered from the same pair of recorded images.

© 2013 Optical Society of America
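
For orientation, the one-dimensional core of this reduction, restated from the Equations section below, is as follows. Two recordings of the same object S are made with kernels that differ only by a known scale factor m:

$$ I_1(x_0) = \int A(x - x_0)\, S(x)\, dx, \qquad I_2(x_0) = \int A\!\left[\frac{x - x_0}{m}\right] S(x)\, dx. $$

In Fourier space these become $i_1(w) = a(w)\, s(w)$ and $i_2(w) = m\, a(mw)\, s(w)$, and eliminating the unknown kernel spectrum $a$ leaves the single functional equation

$$ i_2(w/m)\, s(w) = m\, i_1(w)\, s(w/m), $$

which is solved for $s(w)$ by repeated substitution (see the Equations section below).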

Figures (2)

Fig. 1

The relationship between the two kernels of the imaging system.

Fig. 2

The true image F(x, y), the two recorded images I1 and I2, and the recovered image F_recovered. All images are shown as negatives.
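
The numerical test shown in Fig. 2 can be reproduced in outline from the test object and kernels given in the Equations section below. The following Python sketch is only illustrative: the grid size and the scale factors m and n are assumed values, and scipy.signal.fftconvolve stands in for the continuous convolution integrals.

```python
import numpy as np
from scipy.signal import fftconvolve

# Double-Gaussian test object F(x, y) from the Equations section below.
x = np.arange(80.0)
y = np.arange(80.0)
X, Y = np.meshgrid(x, y, indexing="ij")
F = (np.exp(-(X - 37) ** 2 / 2) + np.exp(-(X - 43) ** 2 / 2)) * \
    (np.exp(-(Y - 37) ** 2 / 2) + np.exp(-(Y - 43) ** 2 / 2))

def separable_gaussian(sig_x, sig_y):
    """Unnormalized separable kernel A(x/m) B(y/n) sampled on the same grid."""
    gx = np.exp(-0.5 * ((x - x.mean()) / sig_x) ** 2)
    gy = np.exp(-0.5 * ((y - y.mean()) / sig_y) ** 2)
    return np.outer(gx, gy)

m, n = 2.0, 2.0  # assumed scale factors relating the two kernels
I1 = fftconvolve(F, separable_gaussian(3.0, 3.0), mode="same")          # first recording
I2 = fftconvolve(F, separable_gaussian(3.0 * m, 3.0 * n), mode="same")  # second recording
```

F can then be recovered from this pair with the spectral recursion sketched after the Equations section.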

Equations (31)

$$ I(x_0, y_0) = \iint K[x - x_0,\, y - y_0]\, F(x, y)\, dx\, dy, $$
$$ I(x_0) = \int A(x - x_0)\, S(x)\, dx, $$
$$ I_1(x_0) = \int A(x - x_0)\, S(x)\, dx, $$
$$ I_2(x_0) = \int A\!\left[\frac{x - x_0}{m}\right] S(x)\, dx $$
$$ i_1(w) = a(w)\, s(w), $$
$$ i_2(w) = m\, a(mw)\, s(w), $$
$$ i_2(w/m)\, s(w) = m\, i_1(w)\, s(w/m). $$
$$ i_2(w/m)\, r(w) = m\, i_1(w)\, r(w/m). $$
$$ \frac{r(w)}{s(w)} = \frac{r(w/m)}{s(w/m)} = \frac{r(w/m^2)}{s(w/m^2)} = \dots = \frac{r(w/m^N)}{s(w/m^N)}. $$
$$ s(w) = \frac{i_1(w)}{\tfrac{1}{m}\, i_2(w/m)}\, s(w/m) = \frac{i_1(w)}{\tfrac{1}{m}\, i_2(w/m)}\, \frac{i_1(w/m)}{\tfrac{1}{m}\, i_2(w/m^2)}\, s(w/m^2) = \dots = \frac{Y(w)}{H(w)}\, s(w/m^N), $$
$$ Y(w) = \prod_{k=0}^{N} i_1\!\left(\frac{w}{m^k}\right), $$
$$ H(w) = \prod_{k=1}^{N} \frac{1}{m}\, i_2\!\left(\frac{w}{m^k}\right). $$
$$ Y(w) = H(w)\, s(w). $$
$$ Y(w) = \prod_{k=0}^{\infty} i_1\!\left(\frac{w}{m^k}\right) = \exp(-3w^2)\, \exp\!\left(-\frac{3w^2}{4}\right) \exp\!\left(-\frac{3w^2}{16}\right) \exp\!\left(-\frac{3w^2}{64}\right) \cdots = \exp(-4w^2). $$
$$ \sum_{k=0}^{\infty} \left| L\!\left(\frac{w}{2^k}\right) \right|, $$
$$ \frac{dL(w)}{dw} = \frac{1}{i_1(w)}\, \frac{d i_1(w)}{dw}. $$
$$ \sum_{k=K+1}^{\infty} \left| L\!\left(\frac{w}{2^k}\right) \right| \le \sum_{k=K+1}^{\infty} \frac{w}{2^k}\, D = \frac{wD}{2^K}. $$
$$ s(w) = \frac{H^*(w)\, Y(w)}{H^*(w)\, H(w) + h^2}, $$
$$ a(w) = \frac{Y^*(w)\, H(w)\, i_1(w)}{Y^*(w)\, Y(w) + \alpha^2}, $$
$$ I_1(x_0, y_0) = \iint K[x - x_0,\, y - y_0]\, F(x, y)\, dx\, dy = \int A[x - x_0] \left\{ \int B[y - y_0]\, F(x, y)\, dy \right\} dx, $$
$$ I_2(x_0, y_0) = \int A\!\left[\frac{x - x_0}{m}\right] \left\{ \int B\!\left[\frac{y - y_0}{n}\right] F(x, y)\, dy \right\} dx, $$
$$ i_1(w, v) = a(w)\, b(v)\, f(w, v), $$
$$ i_2(w, v) = m\, a(mw)\, n\, b(nv)\, f(w, v), $$
$$ i_2\!\left(\frac{w}{m}, \frac{v}{n}\right) f(w, v) = mn\, i_1(w, v)\, f\!\left(\frac{w}{m}, \frac{v}{n}\right) $$
$$ f(w, v) = \frac{i_1(w, v)}{\tfrac{1}{mn}\, i_2\!\left(\frac{w}{m}, \frac{v}{n}\right)}\, f\!\left(\frac{w}{m}, \frac{v}{n}\right) = \frac{i_1(w, v)}{\tfrac{1}{mn}\, i_2\!\left(\frac{w}{m}, \frac{v}{n}\right)}\, \frac{i_1\!\left(\frac{w}{m}, \frac{v}{n}\right)}{\tfrac{1}{mn}\, i_2\!\left(\frac{w}{m^2}, \frac{v}{n^2}\right)}\, f\!\left(\frac{w}{m^2}, \frac{v}{n^2}\right) = \dots $$
$$ f(w, v) = \frac{H^*(w, v)\, Y(w, v)}{H^*(w, v)\, H(w, v) + \varphi^2}, $$
$$ H(w, v) = \prod_{k=1}^{N} \frac{1}{mn}\, i_2\!\left(\frac{w}{m^k}, \frac{v}{n^k}\right), $$
$$ Y(w, v) = \prod_{k=0}^{N} i_1\!\left(\frac{w}{m^k}, \frac{v}{n^k}\right), $$
$$ F(x, y) = \left[\exp\!\left(-\frac{(x - 37)^2}{2}\right) + \exp\!\left(-\frac{(x - 43)^2}{2}\right)\right] \left[\exp\!\left(-\frac{(y - 37)^2}{2}\right) + \exp\!\left(-\frac{(y - 43)^2}{2}\right)\right]; $$
$$ A\!\left[\frac{x - x_0}{m}\right] = \exp\!\left[-\frac{1}{2}\left(\frac{x - x_0}{3m}\right)^2\right], $$
$$ B\!\left[\frac{y - y_0}{n}\right] = \exp\!\left[-\frac{1}{2}\left(\frac{y - y_0}{3n}\right)^2\right], $$
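
Below is a minimal NumPy sketch of the recovery step implied by the products Y(w), H(w) and the regularized division above, for the one-dimensional case. The function name, the truncation order N, the regularization parameter h, and the linear interpolation used to evaluate the spectra at the scaled frequencies w/m^k are illustrative choices, not specifics from the paper; the spectra are assumed to be sampled on a monotonically increasing frequency grid (e.g., after np.fft.fftshift).

```python
import numpy as np

def recover_spectrum(i1, i2, w, m=2.0, N=20, h=1e-3):
    """Estimate the object spectrum s(w) from the spectra i1(w), i2(w) of two
    recordings whose kernels differ by the known scale factor m.

    Builds the truncated products
        Y(w) = prod_{k=0..N} i1(w / m**k)
        H(w) = prod_{k=1..N} (1/m) * i2(w / m**k)
    and returns the regularized quotient s = H* Y / (H* H + h**2).
    """
    def sample(spec, freqs):
        # Evaluate a complex spectrum at arbitrary frequencies by linear
        # interpolation of its real and imaginary parts (w must be increasing).
        return (np.interp(freqs, w, spec.real)
                + 1j * np.interp(freqs, w, spec.imag))

    Y = np.ones_like(i1, dtype=complex)
    H = np.ones_like(i1, dtype=complex)
    for k in range(N + 1):
        Y *= sample(i1, w / m ** k)
        if k >= 1:
            H *= sample(i2, w / m ** k) / m
    return np.conj(H) * Y / (np.conj(H) * H + h ** 2)
```

The restored object is then the inverse Fourier transform of the returned spectrum; the kernel spectrum a(w) can be estimated analogously from the companion regularized expression for a(w) above, and the two-dimensional case follows the same pattern with the pair of scale factors (m, n).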
