Abstract

A reciprocal 360-deg three-dimensional light-field image acquisition and display system was designed using a common catadioptric optical configuration and a lens array. Proof-of-concept experimental setups were constructed, with a full capturing part and a truncated display section, to demonstrate without loss of generality that the proposed design works. Unlike conventional setups, which record and display rectangular volumes, the proposed configuration records 3D images from the surrounding spherical volume in the capture mode and projects 3D images into the same spherical volume in the display mode. This is particularly advantageous in comparison to other 360-deg multi-camera and multi-projector display systems, which require extensive image and physical calibration. We analyzed the system and derived quality measures, such as the angular resolution and the space-bandwidth product, from the design parameters. The issue arising from the pixel-size difference between the available imaging sensor and the display was also addressed: a diffractive microlens array matching the sensor size was used in the acquisition part, whereas a vacuum-cast lens array matching the display size was used in the display part, with correspondingly scaled optics. The experimental results demonstrate that the proposed system design works well and is in good agreement with the simulation results.

© 2019 Optical Society of America
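
The pixel-size mismatch mentioned in the abstract can be illustrated with a short back-of-the-envelope sketch. This is not taken from the paper: the sensor and display pixel pitches, the lenslet pitch, and the focal length below are all assumed values. The only point being illustrated is that scaling the lens-array pitch and focal length by the sensor-to-display pixel ratio preserves the number of pixels behind each lenslet, and therefore the angular sampling of the light field.

```python
# Illustrative sketch (assumed numbers, not the paper's design values):
# scaling a display-side lens array from a sensor-side microlens array to
# compensate for the pixel-size mismatch while keeping the angular sampling.

sensor_pixel_um = 4.5      # assumed sensor pixel pitch
display_pixel_um = 31.5    # assumed pixel pitch of a 4K mobile panel

scale = display_pixel_um / sensor_pixel_um   # ~7x in this example

# Assumed capture-side microlens parameters.
capture_lens_pitch_um = 450.0    # e.g., 100 sensor pixels per lenslet
capture_focal_length_mm = 4.6

# Scaling pitch and focal length by the same factor keeps both the f-number
# and the number of pixels behind each lenslet unchanged.
display_lens_pitch_um = capture_lens_pitch_um * scale
display_focal_length_mm = capture_focal_length_mm * scale

pixels_per_lens_capture = capture_lens_pitch_um / sensor_pixel_um
pixels_per_lens_display = display_lens_pitch_um / display_pixel_um

print(f"scale factor: {scale:.1f}")
print(f"pixels per lenslet (capture / display): "
      f"{pixels_per_lens_capture:.0f} / {pixels_per_lens_display:.0f}")
```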

Figures (15)

Fig. 1. Generic light-field data recording and display method using microlens arrays. Arrows pointing toward the (s,t) plane represent the recording part, and arrows pointing away from the (s,t) plane represent the display part.
Fig. 2. Generic light-field system for 3D image (a) acquisition and (b) display based on the focused plenoptic camera configuration.
Fig. 3. Reciprocal 3D light-field acquisition and display system design with common optics. (a) Capturing. (b) Displaying. Arrows indicate the direction in which the light rays travel. The catadioptric optics are assumed to be on the left side of each sub-figure.
Fig. 4. (a) Proposed design for a 360-deg 3D light-field system in display mode. The 3D images are displayed in a spherical volume rather than a rectangular volume. (b) Light-field camera and (c) display of the proposed design.
Fig. 5. Schematics of a light-field capture system are shown (a) without a mirror, (b) with a (half) conical mirror, and (c) with a (half) parabolic mirror.
Fig. 6. (a) Schematics of the capture system with a parabolic mirror, a field lens, and a lens array. (b) Simulated ray-tracing results on the detector. (c) Chart of the minimum resolvable pixel size against the object distance.
Fig. 7. Simulated image on the detector for the optical setup shown in Fig. 9(b).
Fig. 8. Sets of sub-images with different colors and distances are generated using wave propagation (a minimal sketch of this step is given after the figure list).
Fig. 9. (a) Schematics of the display part with the sub-images and lens array, with simulation results (insets) of the integrated image at the intermediate plane and the viewer’s plane. (b) Display setup. A 4K mobile phone display (Sony Xperia XZ) is used to display the 2D sub-images. A lens array, fabricated by vacuum casting with 10 × 18 elements of 32 mm focal length each and matching the displayed sub-images, is used to reconstruct the 3D images. A stack of Fresnel lenses, each with a focal length and diameter of 300 mm, is used to obtain a low f-number equivalent lens. A 300 mm diameter parabolic mirror, of the type used to mitigate blind spots in traffic, is placed after the Fresnel lens stack.
Fig. 10. (a) Schematic of the capturing setup. (b) Physical capturing setup.
Fig. 11. Captured (left) and processed (right) sub-image sets. The insets on the left and in the middle show portions of the corresponding images; the inset on the right shows the result after red-eye removal is applied. Although we recorded the images using a green color filter, we obtained full-color images after the processing steps, owing to the poor spatial filtering of the filter used over the flash light source.
Fig. 12. Single sub-images cropped from the same location of the recorded images with lens aperture sizes of (a) f/1.8, (b) f/4, and (c) f/11.
Fig. 13. Reconstructed real images after the light rays are reflected from the mirror surface. (a) Focus on the letter “B” and (b) focus on the letter “A” when a narrow-angle (15° Luminit) holographic diffuser is placed at the intermediate imaging planes. (c) Observed reconstructed images reflected from the parabolic mirror surface without any diffuser in the intermediate imaging plane. (d) Observed reconstructions realized on the diffuser, which is placed at the reconstruction distance after the rays are reflected from the mirror surface.
Fig. 14. Reconstruction of the captured image “F” realized on a diffuser after it is reflected from the parabolic mirror surface. (left) Full color. (right) Green color only.
Fig. 15. Display pixel configuration. From left to right, the full-color image and the single color channels isolated from it are shown. The full-color image was shot under a 60× microscope objective. The dashed boxes represent a single pixel composed of sub-pixels of each individual color channel. The blue channel leaks through the green pixels; this is visible when the picture is zoomed in.
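
As a rough illustration of the wave-propagation sub-image generation referred to in Fig. 8, the sketch below (not the authors' code) propagates a point-like object field to a small lenslet array using the standard angular-spectrum method and then to the lenslet focal plane, producing one elemental sub-image per lenslet. The grid size, wavelength, lenslet pitch, focal length, and object distance are all assumed values chosen only so that the simulation is reasonably sampled.

```python
import numpy as np

def angular_spectrum(field, wavelength, dx, z):
    """Propagate a sampled complex field a distance z via the angular-spectrum method."""
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=dx)
    fy = np.fft.fftfreq(ny, d=dx)
    FX, FY = np.meshgrid(fx, fy)
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    kz = 2.0 * np.pi / wavelength * np.sqrt(np.maximum(arg, 0.0))
    H = np.where(arg > 0, np.exp(1j * kz * z), 0.0)   # drop evanescent components
    return np.fft.ifft2(np.fft.fft2(field) * H)

# Assumed simulation parameters (illustrative only).
wavelength = 532e-9   # green, matching the green-filter experiments
dx = 2e-6             # sampling pitch of the grid
n = 512               # grid size
pitch = 256e-6        # lenslet pitch (128 samples per lenslet)
f_lens = 2e-3         # lenslet focal length
z_obj = 5e-3          # distance from the point object to the lens array

x = (np.arange(n) - n / 2) * dx
X, Y = np.meshgrid(x, x)

# Point object modeled as a small Gaussian spot, propagated to the array plane.
obj = np.exp(-(X**2 + Y**2) / (2 * (2 * dx) ** 2)).astype(complex)
field_at_array = angular_spectrum(obj, wavelength, dx, z_obj)

# Thin-lens phase tiled over the array: coordinates folded into each lenslet cell.
xl = (X + pitch / 2) % pitch - pitch / 2
yl = (Y + pitch / 2) % pitch - pitch / 2
lens_array_phase = np.exp(-1j * np.pi / (wavelength * f_lens) * (xl**2 + yl**2))

# Propagate to the focal (sensor) plane; the intensity gives the set of sub-images.
sub_images = np.abs(angular_spectrum(field_at_array * lens_array_phase,
                                     wavelength, dx, f_lens)) ** 2
print(sub_images.shape, sub_images.max())
```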

Equations (2)

\[ \theta_x = 2\arctan\!\left(\frac{D}{2z_i}\right) \quad\text{and}\quad \delta = \theta_x / M_x. \]
\[ \mathrm{SBP} = \frac{2\arctan\!\left(\dfrac{D}{2z_i}\right)}{M_x} \times \frac{1.22 \, f_1 \lambda}{d} \times N_x. \]
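
As a quick numeric illustration of the quantities in Eqs. (1) and (2): apart from the 300 mm mirror diameter and the 32 mm lenslet focal length quoted in the Fig. 9 caption, the values below are assumptions rather than the paper's design parameters. The sketch evaluates the angular extent θx, the angular resolution δ, and the diffraction-limited spot size 1.22 f1 λ/d that enters the space-bandwidth product.

```python
import math

# Illustrative values; only D and f_1 come from the Fig. 9 caption,
# the rest are assumptions made for the sake of the example.
D = 0.300       # mirror (aperture) diameter, m
z_i = 0.5       # assumed object/image distance, m
M_x = 100       # assumed number of lenslets along x
f_1 = 32e-3     # lenslet focal length, m
lam = 532e-9    # assumed (green) wavelength, m
d = 3e-3        # assumed lenslet aperture, m
N_x = 3840      # assumed pixel count along x (4K panel)

theta_x = 2 * math.atan(D / (2 * z_i))   # Eq. (1): angular extent
delta = theta_x / M_x                    # Eq. (1): angular resolution

spot = 1.22 * f_1 * lam / d              # diffraction-limited spot in Eq. (2)

print(f"theta_x = {math.degrees(theta_x):.1f} deg")
print(f"delta   = {math.degrees(delta):.2f} deg")
print(f"1.22*f_1*lambda/d = {spot * 1e6:.1f} um")
# Eq. (2) combines these quantities with N_x to give the space-bandwidth product.
```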
