Abstract

Photography usually requires optics in conjunction with a recording device (an image sensor). Eliminating the optics could lead to new form factors for cameras. Here, we report a simple demonstration of imaging with a bare CMOS sensor and computation. The technique relies on the space-variant point-spread functions that result from the interaction of a point source in the field of view with the image sensor. These space-variant point-spread functions are combined with a reconstruction algorithm to image simple objects displayed on a discrete LED array as well as on an LCD screen. We extended the approach to video imaging. Finally, we performed experiments to analyze the impact of the object distance. Improved sensor designs and reconstruction algorithms could lead to useful cameras without optics.

© 2017 Optical Society of America

Supplementary Material (2)

» Visualization 1: Reconstructed video from the lensless imager.
» Visualization 2: Imaging a video on an LCD screen.

Figures (11)

Fig. 1. Experimental setup and principle of operation. (a) Photograph of the setup showing the LED matrix in front of the image sensor (top-right inset). (b) Sensor images are recorded separately for each LED in the array. Three exemplary calibration images are shown for three locations of the LED point source. (c) To test image reconstruction, a simple pattern is displayed on the LED matrix. The image captured by the sensor is fed into the recovery algorithm to reconstruct the scene.
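
As a rough illustration of the calibration step in Fig. 1(b) and the forward model it implies, the following Python/NumPy sketch stacks one flattened sensor frame per LED into the columns of a calibration matrix A, so that a scene x on the LED array maps to a sensor measurement b = Ax. The array sizes and the capture_frame() routine are placeholders for illustration, not the authors' acquisition code.

    import numpy as np

    rng = np.random.default_rng(0)
    led_grid = (16, 16)                    # assumed LED-matrix size (downscaled for this sketch)
    frame_shape = (120, 160)               # assumed sensor resolution (downscaled for this sketch)
    num_leds = led_grid[0] * led_grid[1]
    num_pixels = frame_shape[0] * frame_shape[1]

    def capture_frame(led_index):
        # Stand-in for reading out the sensor while only one LED is lit;
        # random data here just so the sketch runs end to end.
        return rng.random(frame_shape)

    # Each column of A is the flattened, space-variant PSF of one LED.
    A = np.empty((num_pixels, num_leds))
    for i in range(num_leds):
        A[:, i] = capture_frame(i).ravel()

    # Forward model: LED brightness vector x -> sensor measurement b.
    x = np.zeros(num_leds)
    x[:5] = 1.0                            # a toy object
    b = A @ x
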
Fig. 2. Exemplary images taken with the sensor. The left column shows the objects displayed on the LED matrix. The second column shows the raw sensor images. The third column shows the reconstructed images before any processing. The right column shows the reconstructed images after binary thresholding. Video of a stickman is included as Visualization 1.
Fig. 3. Impact of object distance. (a) Reconstructed images obtained at various distances, D. (b) Length of a line object imaged at various distances, D, in the horizontal, vertical, and diagonal directions. Insets show reconstructed images of diagonal lines captured at D = 85, 162, and 242 mm. (c) Singular-value-decay plot of the calibration matrix A captured at various values of D. (d) Correlation matrix among the calibration images for D = 343 mm along the horizontal, vertical, and diagonal directions.
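
The singular-value decay and correlation analysis of Figs. 3(c) and 3(d) amount to standard linear algebra on the calibration matrix; a minimal sketch follows, with a random placeholder A (one flattened calibration image per column) standing in for the measured calibration data.

    import numpy as np

    A = np.random.default_rng(1).random((19200, 256))   # placeholder calibration matrix

    # Fig. 3(c): decay of the singular values of A; a slower decay (sigma_min
    # not much smaller than sigma_max) indicates a better-posed reconstruction.
    s = np.linalg.svd(A, compute_uv=False)
    decay = s / s[0]

    # Fig. 3(d): correlation between calibration images (columns of A);
    # large off-diagonal values mean nearby LED positions produce similar
    # sensor patterns and are harder to distinguish.
    corr = np.corrcoef(A, rowvar=False)
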
Fig. 4. Reconstruction of objects displayed on an LCD screen (iPad Mini, Apple). (a) Photograph of the setup and reconstruction results at D = 343 mm. (b) Reconstruction of a star in four different colors. The red, green, and blue color channels of the captured image were used to reconstruct the red, green, and blue stars, respectively; all three color channels were averaged to reconstruct the white star. Insets show the original displayed images. (c) Reconstruction of the word “HI” at varying object locations. In all cases, including (c), the same matrix A for D = 343 mm was used. A video of a moving arrow is included as Visualization 2.
Fig. 5. Detailed image of one LED located at the top-left corner, showing three defects (possibly dust particles) on the sensor, indicated by the square outlines. The region marked with the red square is shown at the bottom for different LEDs, clearly indicating a shift of the “defect” scattering pattern.
Fig. 6. Enlarged CMOS images of the stickman (top), UTAH (middle), and space-invader (bottom) objects. Conventional photographs of the LED matrix are shown in the insets.
Fig. 7. Reconstructions performed after removing the sections of the images where particle shadows are cast. The removed regions are marked by red boxes drawn on one of the calibration images. The results show successful image reconstruction without the particle shadows; however, an increased noise level is observed, suggesting that the particle shadows carry at least part of the image information.
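
One way to emulate the removal described in Fig. 7 is to drop the affected sensor pixels, i.e., the corresponding rows of the calibration matrix and of the measurement, before solving. The sketch below uses placeholder data and an illustrative defect mask, not the actual defect coordinates.

    import numpy as np

    rng = np.random.default_rng(2)
    A = rng.random((19200, 256))           # placeholder calibration matrix
    x_true = (rng.random(256) > 0.9).astype(float)
    b = A @ x_true                         # placeholder sensor measurement

    keep = np.ones(A.shape[0], dtype=bool)
    keep[5000:5600] = False                # pixels under a particle shadow (illustrative)

    # Least-squares reconstruction using only the defect-free pixels.
    x_hat, *_ = np.linalg.lstsq(A[keep], b[keep], rcond=None)
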
Fig. 8. Reconstruction of the stickman object using Tikhonov regularization (left) and orthogonal matching pursuit (remaining panels). Computation time is noted below each panel. K is the number of non-zero elements used for the OMP reconstruction. When a proper value of K was chosen, OMP produced cleaner images than the Tikhonov regularization method, at the expense of computation time.
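
For the OMP reconstruction compared in Fig. 8, an off-the-shelf sparse solver can play the same role; a minimal sketch using scikit-learn's OrthogonalMatchingPursuit with placeholder data (the sparsity level K here is illustrative, not the authors' setting):

    import numpy as np
    from sklearn.linear_model import OrthogonalMatchingPursuit

    rng = np.random.default_rng(3)
    A = rng.random((19200, 256))                           # placeholder calibration matrix
    x_true = np.zeros(256)
    x_true[rng.choice(256, size=20, replace=False)] = 1.0  # sparse toy object
    b = A @ x_true                                         # placeholder sensor measurement

    K = 20                                                 # allowed number of non-zero elements
    omp = OrthogonalMatchingPursuit(n_nonzero_coefs=K)
    omp.fit(A, b)                                          # greedy sparse recovery of x from b = Ax
    x_hat = omp.coef_
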
Fig. 9. Lensless imaging performed using a CCD sensor array. Objects similar to the ones used in the CMOS imaging were reconstructed. Even after cleaning, the cover glass of this sensor contained particles, which cast multiple shadow patterns replicating the displayed objects; these are visible to the naked eye in the raw data above. Combined with the proposed calibration and reconstruction algorithm, the reconstructions show improved contrast and quality compared to the raw shadows.
Fig. 10. Simulation of the lensless imager.
Fig. 11. (a) Schematic of the impact of D. (b) Map of PSF ℓ2 norms as a function of position in the object plane. The dark lines at the bottom are due to the holder used for the LED matrix.

Equations (2)

\kappa(A) = \max\left( \frac{\|A^{-1}e\|}{\|A^{-1}b\|} \cdot \frac{\|b\|}{\|e\|} \right) = \frac{\sigma_{\max}}{\sigma_{\min}},
\hat{x} = \operatorname{argmin}_{x} \|Ax - b\|_2^2 + \alpha^2 \|x\|_2^2,
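
Both quantities above map directly onto standard NumPy calls; a minimal sketch with placeholder A and b (not the experimental data):

    import numpy as np

    rng = np.random.default_rng(4)
    A = rng.random((19200, 256))           # placeholder calibration matrix
    b = A @ rng.random(256)                # placeholder sensor measurement

    # Condition number kappa(A): ratio of largest to smallest singular value.
    s = np.linalg.svd(A, compute_uv=False)
    kappa = s[0] / s[-1]                   # equivalently, np.linalg.cond(A)

    # Tikhonov-regularized reconstruction of the second equation, solved via
    # the normal equations (A^T A + alpha^2 I) x = A^T b.
    alpha = 0.1
    x_hat = np.linalg.solve(A.T @ A + alpha**2 * np.eye(A.shape[1]), A.T @ b)
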
