Abstract

We consider the capabilities and limits of strategies for single-aperture three-dimensional and extended depth of field optical imaging. We show that reduced spatial resolution is implicit in forward models for light field sampling and that reduced modulation transfer efficiency is intrinsic to pupil coding. We propose a novel strategy based on image space modulation and show that this strategy can be sensitive to high-resolution spatial features across an extended focal volume.

© 2011 Optical Society of America

Figures (17)

Fig. 1. Illumination of a camera aperture by a 3D object.

Fig. 2. Aperture plane geometry.

Fig. 3. $Q(x, s_x)$ for $A = x_p = d_i$. The $x$ (horizontal) axis is in units of wavelengths, and the $s_x$ (vertical) axis is in units of radians.

Fig. 4. Defocus transfer versus $u$ (in units of $A/\lambda$) for (a) a clear rectangular aperture, (b) a cubic phase modulated aperture, (c) a high-pass aperture, and (d) a diffuse aperture. In each case, $|\hat{h}(u,z)|$ is evaluated at $z = 0$, $\lambda^2/A^2$, $2\lambda^2/A^2$, $5\lambda^2/A^2$, $10\lambda^2/A^2$, and $50\lambda^2/A^2$. The diagonal line in (a) is the "diffraction-limited MTF." For the clear aperture, the transfer function degrades monotonically as $z$ increases. The aperture is set at $A = 10^3\lambda$ in these examples.

Fig. 5. Defocus transfer versus $z$ for the apertures of Fig. 4. $|\hat{h}(u,z)|$ is shown at $u = 0.05A/\lambda$, $0.5A/\lambda$, and $0.75A/\lambda$ in each case. The plot for $u = 0.5A/\lambda$ vanishes for case (c).

Fig. 6. $\phi(x)$ for aperture (c).

Fig. 7. $\mathrm{MTF}_{sv}(u)$ for apertures (a) and (c).

Fig. 8. Aperture (c): $|\hat{h}(u,z)|$ versus $zA^2$ for $u = 0.45A$, $0.5A$, and $0.5A$.

Fig. 9. Aperture (c): $\mathrm{Arg}(\hat{h}(u,z))$ versus $zA^2$ for $u = 0.45A$, $0.5A$, and $0.5A$.

Fig. 10. Singular values for unmodulated low-pass object space projections.

Fig. 11. Object space singular vectors 1 to 64 for the unmodulated low-pass system. The object space dimension (and thus the dimension of each singular vector) is $256 \times 32$. The figure shows each singular vector as a $256 \times 32$ block (each block is constant in $z$ in this case).

Fig. 12. Object space singular vectors 1 to 64 for the modulated low-pass system. Despite the low-pass filtering, full-resolution modulation is present in both $x$ and $z$, even in the lowest-order singular vectors.

Fig. 13. Coded aperture modulation in image space.

Fig. 14. Feature sizes for coded aperture focal tomography: (a) cross section of the in-focus spot, (b) cross section of a spot diffracted from the coded aperture to the focal plane, and (c) cross section of a coded aperture feature for the system of Fig. 13.

Fig. 15. Singular values for coded aperture image space modulation. The upper smooth and jagged curves correspond to the singular values for the focal image; the lower smooth and jagged curves plot the singular values for the image focused on the coded aperture.

Fig. 16. (a) Object space singular vectors for the image focused on the focal plane with no coded aperture, (b) singular vectors for the image focused on the focal plane after modulation by the displaced coded aperture, (c) spatial spectrum of the singular vectors of (a), and (d) spatial spectrum of the singular vectors of (b).

Fig. 17. (a) Object space singular vectors for the image focused on the coded aperture plane and then propagated to the focal plane with no coded aperture present, (b) singular vectors for the coded aperture plane image modulated by the coded aperture and then propagated to the focal plane, (c) spatial spectrum of the singular vectors of (a), and (d) spatial spectrum of the singular vectors of (b).

Equations (28)


$$\mathrm{SNR}(u) = \frac{\hat{f}(u)\,\mathrm{MTF}(u)}{\sigma(u)},$$

$$J(\Delta x, \bar{x}) = \int \frac{f(x, y, z)}{z^2}\, e^{\frac{i 2\pi}{\lambda z}\Delta x \cdot (x - \bar{x})}\, dx\, dy\, dz,$$

$$\Delta x = x_1 - x_2,$$

$$\bar{x} = \frac{x_1 + x_2}{2},$$

$$S(x, y, z) = \frac{1}{\lambda^2 z^2} \int\!\!\int J(\Delta x, \bar{x})\, e^{\frac{i 2\pi}{\lambda}\Delta x \cdot \bar{x}\left(\frac{1}{F} - \frac{1}{z}\right)}\, e^{\frac{i 2\pi}{\lambda z}\Delta x \cdot x}\, P^{*}\!\left(\bar{x} + \frac{\Delta x}{2}\right) P\!\left(\bar{x} - \frac{\Delta x}{2}\right) d\Delta x\, d\bar{x},$$

$$S(\theta_x, \theta_y, \theta_z) = \tilde{f}(\theta_x, \theta_y, \theta_z) * h(\theta_x, \theta_y, \theta_z),$$

$$h(\theta_x, \theta_y, \theta_z) = \int\!\!\int e^{\frac{i 2\pi}{\lambda}\Delta x \cdot \left(\theta_x \hat{i}_x + \theta_y \hat{i}_y - \bar{x}\lambda\theta_z\right)}\, P\!\left(\bar{x} - \frac{\Delta x}{2}\right) P^{*}\!\left(\bar{x} + \frac{\Delta x}{2}\right) d\Delta x\, d\bar{x}.$$

$$B(\bar{x}, s) = \int J(\Delta x, \bar{x})\, e^{\frac{i 2\pi}{\lambda} s \cdot \Delta x}\, d\Delta x,$$

$$B(\bar{x}, s) = \int \frac{f(\bar{x} - s z, z)}{z^2}\, dz,$$

$$S(x) = \frac{1}{\lambda^2 F^2} \int\!\!\int J(\Delta x, \bar{x})\, e^{\frac{i 2\pi}{\lambda F}\Delta x \cdot x}\, P\!\left(x + \bar{x} + \frac{\Delta x}{2}\right) P^{*}\!\left(x + \bar{x} - \frac{\Delta x}{2}\right) d\Delta x\, d\bar{x}.$$

$$S(x) = \int\!\!\int B(\bar{x}, s)\, A(x, \bar{x}, s)\, d\bar{x}\, ds,$$

$$A(x, \bar{x}, s) = \frac{1}{\lambda^2 F^2} \int e^{\frac{i 2\pi}{\lambda}\Delta x \cdot \left(s + \frac{x}{F}\right)}\, P\!\left(x + \bar{x} + \frac{\Delta x}{2}\right) P^{*}\!\left(x + \bar{x} - \frac{\Delta x}{2}\right) d\Delta x.$$
$$J'(\Delta x, \bar{x}) = \int\!\!\int J(\Delta x', \bar{x}')\, h\!\left(\bar{x} + \frac{\Delta x}{2} - \bar{x}' - \frac{\Delta x'}{2}\right) h^{*}\!\left(\bar{x} - \frac{\Delta x}{2} - \bar{x}' + \frac{\Delta x'}{2}\right) d\Delta x'\, d\bar{x}',$$

$$B_i(\bar{x}, s) = \int\!\!\int B_o(\bar{x}', s)\, P(\lambda d_i u)\, P^{*}(\lambda d_i u + 2 d_i s)\, e^{\frac{i 4\pi}{\lambda}(s + \lambda u)\cdot(\bar{x} - \bar{x}')}\, d\bar{x}'\, du,$$

$$B_i(\bar{x}, s) \approx B_o(\bar{x}, s)\, |P(d_i s)|^2.$$

$$B_i(\bar{x}, s) = \int B_o(\bar{x}', s)\, Q(\bar{x} - \bar{x}', s)\, d\bar{x}',$$

$$Q(x, s) = \int P(\lambda d_i u)\, P^{*}(\lambda d_i u + 2 d_i s)\, e^{\frac{i 4\pi}{\lambda}(s + \lambda u)\cdot x}\, du.$$

$$P(x, y) = \mathrm{rect}\!\left(\frac{x - x_p}{A}\right)\mathrm{rect}\!\left(\frac{y - y_p}{A}\right),$$

$$Q(x, s_x) = \begin{cases} 0 & \text{for } s_x \le \dfrac{2 x_p - A}{2 d_i} \ \text{or}\ s_x \ge \dfrac{2 x_p + A}{2 d_i}, \\[2ex] 2\left(\dfrac{A - 2 x_p}{2\lambda d_i} + \dfrac{s_x}{\lambda}\right)\mathrm{sinc}\!\left(4\left(\dfrac{A - 2 x_p}{2\lambda d_i} + \dfrac{s_x}{\lambda}\right) x\right) & \text{for } \dfrac{2 x_p - A}{2 d_i} < s_x < \dfrac{x_p}{d_i}, \\[2ex] 2\left(\dfrac{A + 2 x_p}{2\lambda d_i} - \dfrac{s_x}{\lambda}\right)\mathrm{sinc}\!\left(4\left(\dfrac{A + 2 x_p}{2\lambda d_i} - \dfrac{s_x}{\lambda}\right) x\right) & \text{for } \dfrac{x_p}{d_i} < s_x < \dfrac{2 x_p + A}{2 d_i}. \end{cases}$$
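As reconstructed here, $Q(x, s_x)$ for the offset square sub-aperture is a triangular passband in $s_x$, supported on $(2x_p - A)/(2d_i) < s_x < (2x_p + A)/(2d_i)$ and peaking at $s_x = x_p/d_i$, with a sinc modulation in $x$ whose width tracks the local bandwidth. The following minimal Python sketch evaluates that piecewise expression; the values chosen for $\lambda$, $A$, $x_p$, and $d_i$ are illustrative assumptions, not parameters from the paper.

```python
import numpy as np

# Illustrative parameters (assumed, not from the paper); all lengths in meters.
lam = 1e-6   # wavelength
A   = 1e-3   # sub-aperture width
x_p = 1e-3   # sub-aperture center offset
d_i = 1e-2   # image distance

def Q(x, s_x):
    """Piecewise light-field blur kernel for the offset square sub-aperture,
    following the reconstructed expression above."""
    lo, mid, hi = (2*x_p - A) / (2*d_i), x_p / d_i, (2*x_p + A) / (2*d_i)
    if s_x <= lo or s_x >= hi:
        return 0.0
    if s_x < mid:
        w = (A - 2*x_p) / (2*lam*d_i) + s_x / lam
    else:
        w = (A + 2*x_p) / (2*lam*d_i) - s_x / lam
    return 2 * w * np.sinc(4 * w * x)   # np.sinc(t) = sin(pi*t)/(pi*t)

# Angular envelope Q(0, s_x): a triangle on (lo, hi) that peaks at s_x = x_p/d_i
# with value A/(lam*d_i), so the sub-aperture passes only a limited slice of angles.
s = np.linspace(0.0, 0.2, 1001)
env = np.array([Q(0.0, sx) for sx in s])
print("angular support:", (2*x_p - A) / (2*d_i), "to", (2*x_p + A) / (2*d_i))
print("peak / (A/(lam*d_i)):", env.max() * lam * d_i / A)  # ~1
```

At $x = 0$ the sinc factor is unity, so the printout isolates the angular envelope that limits light field resolution for a reduced sub-aperture.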
$$g(x) = \int\!\!\int f(x', z)\, h(x - x', z)\, dx'\, dz,$$

$$h(x, z) = \int\!\!\int e^{2\pi i (\alpha x - \alpha\beta z)}\, P\!\left(\beta + \frac{\alpha}{2}\right) P^{*}\!\left(\beta - \frac{\alpha}{2}\right) d\alpha\, d\beta.$$

$$\hat{g}(u) = \int \hat{f}(u, z)\, \hat{h}(u, z)\, dz,$$

$$f_u(x, z) = \frac{e^{2\pi i u x}\, \hat{h}^{*}(u, z)}{\int_0^{z_{\max}} |\hat{h}(u, z)|^2\, dz}.$$

$$\mathrm{MTF}_{sv}(u) = \int_0^{z_{\max}} |\hat{h}(u, z)|^2\, dz \approx \frac{1}{|u|}\int \left|P\!\left(\beta - \frac{u}{2}\right) P\!\left(\beta + \frac{u}{2}\right)\right|^2 d\beta$$

$$\hat{h}(u, z) = \int e^{-2\pi i u \beta z}\, P\!\left(\beta + \frac{u}{2}\right) P^{*}\!\left(\beta - \frac{u}{2}\right) d\beta.$$

$$\mathrm{MTF}_{sv}(0) = z_{\max} \int |P(\beta)|^4\, d\beta.$$

$$\begin{aligned} (a)\quad & P(x) = \mathrm{rect}\!\left(\frac{x}{A}\right), \\ (b)\quad & P(x) = e^{i\alpha x^3}\,\mathrm{rect}\!\left(\frac{x}{A}\right), \\ (c)\quad & P(x) = \mathrm{rect}\!\left(\frac{x}{A}\right) - \mathrm{rect}\!\left(\frac{2x}{A}\right), \\ (d)\quad & P(x) = e^{i\phi(x)}\,\mathrm{rect}\!\left(\frac{x}{A}\right), \end{aligned}$$

$$g(x) = \int\!\!\int f(x', z)\, t(x', z)\, h(x - x', z)\, dx'\, dz.$$
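To make the defocus transfer function and the singular-value MTF above concrete, the sketch below evaluates $\hat{h}(u,z)$ and $\mathrm{MTF}_{sv}(u)$ by direct quadrature for the clear aperture (a) and the high-pass aperture (c), in normalized units with $\lambda = 1$ and $A = 1$. The grid sizes, the defocus range $z_{\max}$, and the Riemann-sum quadrature are illustrative choices, not the paper's implementation.

```python
import numpy as np

A = 1.0  # aperture width in normalized units (lambda = 1)

def P_clear(x):
    # Aperture (a): clear rectangular pupil, rect(x/A)
    return (np.abs(x) <= A / 2).astype(float)

def P_highpass(x):
    # Aperture (c): rect(x/A) - rect(2x/A), i.e. the central half of the pupil is blocked
    return (np.abs(x) <= A / 2).astype(float) - (np.abs(x) <= A / 4).astype(float)

def h_hat(P, u, z, n=4096):
    """Defocus transfer  h^(u,z) = int exp(-2*pi*i*u*beta*z) P(beta+u/2) P*(beta-u/2) dbeta,
    evaluated by a simple Riemann sum (a sketch, not the paper's method)."""
    beta = np.linspace(-A, A, n)
    dbeta = beta[1] - beta[0]
    integrand = np.exp(-2j * np.pi * u * beta * z) * P(beta + u / 2) * np.conj(P(beta - u / 2))
    return integrand.sum() * dbeta

def mtf_sv(P, u, z_max=10.0, nz=400):
    """Singular-value MTF  MTF_sv(u) = int_0^{z_max} |h^(u,z)|^2 dz  (per the expression above)."""
    z = np.linspace(0.0, z_max, nz)
    dz = z[1] - z[0]
    return sum(abs(h_hat(P, u, zi))**2 for zi in z) * dz

u_vals = [0.05, 0.5, 0.75]  # in units of A/lambda, as in Fig. 5
for name, P in [("clear (a)", P_clear), ("high-pass (c)", P_highpass)]:
    # defocus z = 5, i.e. 5*lambda^2/A^2 in these normalized units
    row = [round(abs(h_hat(P, u, z=5.0)), 4) for u in u_vals]
    print(name, "|h^(u, z=5)| at u =", u_vals, "->", row)
    print(name, "MTF_sv(u=0.5) =", round(mtf_sv(P, 0.5), 4))
```

For the high-pass aperture the $u = 0.5$ values come out near zero, consistent with the Fig. 5 caption noting that the $u = 0.5A/\lambda$ plot vanishes for case (c), while the clear aperture retains support there but loses contrast with defocus.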
