Abstract

The challenge in rendering integral images is to use as much of the information preserved in the light field as possible to reconstruct the captured scene in three dimensions. We propose a rendering algorithm based on the projection of rays through a detailed simulation of the optical path, taking into account the physical properties and locations of all optical elements. The rendered images contain information about the correct size of imaged objects without the need to calibrate the imaging device. In addition, aberrations of the optical system may be corrected, depending on the setup of the integral imaging device. We show simulation data that illustrate the aberration-correction capability, as well as experimental data from our plenoptic camera that demonstrate the ability of the proposed algorithm to measure size and distance. We believe this rendering procedure will be useful in the future for three-dimensional ophthalmic imaging of the human retina.

© 2014 Optical Society of America
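To make the projection idea concrete, the sketch below back-projects each pixel of a (one-dimensional, toy) integral image through its microlens and the main lens onto a chosen object plane and accumulates the intensities there. It is only a minimal illustration using paraxial thin-lens ray transfer; the paper's algorithm traces rays through a detailed model of every optical element, so all names and parameters here (thin_lens, render_plane, f_micro, f_main, the element distances) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def thin_lens(y, theta, f):
    """Paraxial refraction at a thin lens of focal length f (ABCD convention)."""
    return y, theta - y / f

def propagate(y, theta, d):
    """Free-space propagation of a paraxial ray over a distance d."""
    return y + d * theta, theta

def render_plane(raw, pixel_size, pixels_per_lens, lens_pitch,
                 d_sensor_mla, d_mla_main, f_micro, f_main,
                 z_object, out_px, out_extent):
    """Back-project every pixel of a 1-D integral image onto the object
    plane at distance z_object from the main lens and average the hits."""
    image = np.zeros(out_px)
    hits = np.zeros(out_px)
    n_lenses = raw.size // pixels_per_lens
    for k in range(n_lenses):
        cy = (k - (n_lenses - 1) / 2) * lens_pitch            # microlens centre
        for j in range(pixels_per_lens):
            y_pix = cy + (j - (pixels_per_lens - 1) / 2) * pixel_size
            theta = (cy - y_pix) / d_sensor_mla               # ray: pixel -> lens centre
            y, th = thin_lens(cy, theta, f_micro)             # through the microlens
            y, th = propagate(y, th, d_mla_main)              # gap to the main lens
            y, th = thin_lens(y, th, f_main)                  # through the main lens
            y, th = propagate(y, th, z_object)                # out to the object plane
            idx = int(round((y / out_extent + 0.5) * (out_px - 1)))
            if 0 <= idx < out_px:
                image[idx] += raw[k * pixels_per_lens + j]    # splat the pixel value
                hits[idx] += 1
    return image / np.maximum(hits, 1)
```

Because the back-projection is carried out in the physical units of the optical model, the coordinate at which a pixel lands on the object plane directly encodes object size, which is why no separate calibration is needed; replacing the two thin-lens steps with a full ray trace through the actual lens prescription is what additionally allows aberrations to be compensated.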


References


  1. B. Javidi, I. Moon, and S. Yeom, “Three-dimensional identification of biological microorganism using integral imaging,” Opt. Express 14, 12096–12108 (2006).
    [CrossRef]
  2. M. Levoy, R. Ng, A. Adams, M. Footer, and M. Horowitz, “Light field microscopy,” ACM Trans. Graph. 25, 924–934 (2006).
    [CrossRef]
  3. T. Georgiev, Z. Yu, A. Lumsdaine, and S. Goma, “Lytro camera technology: theory, algorithms, performance analysis,” Proc. SPIE 8667, 86671J (2013).
    [CrossRef]
  4. M. G. Lippmann, “Épreuves réversibles, photographies intégrales,” J. Phys. A 7, 821–825 (1908).
  5. X. Xiao, B. Javidi, M. Martinez-Corral, and A. Stern, “Advances in three-dimensional integral imaging: sensing, display, and applications [Invited],” Appl. Opt. 52, 546–560 (2013).
    [CrossRef]
  6. J.-H. Park, K. Hong, and B. Lee, “Recent progress in three-dimensional information processing based on integral imaging,” Appl. Opt. 48, H77–H94 (2009).
    [CrossRef]
  7. G. Wetzstein, I. Ihrke, D. Lanman, W. Heidrich, R. Raskar, and K. Akeley, “Computational plenoptic imaging,” ACM SIGGRAPH 2012 Courses 92206, 2397–2426 (2012).
  8. T. Georgiev and A. Lumsdaine, “Focused plenoptic camera and rendering,” J. Electron. Imaging 19, 021106 (2010).
    [CrossRef]
  9. T. E. Bishop and P. Favaro, “Full-resolution depth map estimation from an aliased plenoptic light field,” in Proceedings of the 10th Asian Conference on Computer Vision, Vol. 6493 (2011), pp. 186–200.
  10. S.-H. Hong, J.-S. Jang, and B. Javidi, “Three-dimensional volumetric object reconstruction using computational integral imaging,” Opt. Express 12, 483–491 (2004).
    [CrossRef]
  11. G. Passalis, N. Sgouros, S. Athineos, and T. Theoharis, “Enhanced reconstruction of three-dimensional shape and texture from integral photography images,” Appl. Opt. 46, 5311–5320 (2007).
    [CrossRef]
  12. R. Ng and P. Hanrahan, “Digital correction of lens aberrations in light field photography,” Proc. SPIE 6342, 63421E (2006).
    [CrossRef]
  13. A. Lumsdaine and T. Georgiev, “The focused plenoptic camera,” in Proceedings of IEEE International Conference on Computational Photography (ICCP, 2009).
  14. FRED Optimum, Photon Engineering LLC, http://www.photonengr.com/
  15. C. H. Wu, “Depth measurement in integral images,” Ph.D. dissertation (De Montfort University, 2003).
  16. T. Georgiev and A. Lumsdaine, “Depth of field in plenoptic cameras,” in Proceedings of Eurographics 2009, Munich, Germany (2009).




Figures (7)

Fig. 1.

Schematic view of the capturing process of the plenoptic camera: the character “3” (a) is imaged by the main lens (b). Each lens of the microlens array (d) images a specific view of the intermediate image of the “3” (c) onto the sensor (e).

Fig. 2.

Detail of a high-precision millimeter scale. (a) The image a standard camera would capture. (b) The integral image of the same detail. Different parts of the characters “3” and “2” are imaged behind different microlenses.

Fig. 3.

Simulation results for imaging through a tilted lens. (a) The “3” used as the light source. (b) The picture a standard camera would capture. (c) The back-projected image of the standard camera amplifies the aberration. (d) The back-projected image of the plenoptic camera through the whole optical system, showing the corrected aberration. (e) The back-projected intermediate image of the plenoptic camera, still blurred. (f) The back-projected image (d) masked with the light source (a), showing the binary difference.

Fig. 4.

Image of a millimeter scale, rendered by projection to perform size measurements. The scale was located 60 mm away from the main lens.

Fig. 5.

Image-derived relative distance of the target compared to the linear stage distance.

Fig. 6.

Change of elemental images during the distance measurements.

Fig. 7.

Image-derived relative distance of the target compared to the linear stage distance for the most reliable range in 0.1 mm steps.
