Abstract

We propose an integrated 3D display and imaging system that combines a head-mounted device with a special dual-purpose passive screen capable of simultaneously facilitating 3D display and imaging. The screen is composed of two optical layers. The first layer is a projection surface of finely patterned retro-reflective microspheres that provide high optical gain when illuminated by the head-mounted projectors. The second layer is an imaging surface made up of an array of curved mirrors, which form the perspective views of the scene captured by a head-mounted camera. The display and imaging operations are separated by polarization multiplexing. The demonstrated prototype consists of a head-worn unit with a pair of 15 lumen pico-projectors and a 24 MP camera, and an in-house designed and fabricated 30 cm × 24 cm screen. The screen provides a bright display using 25%-filled retro-reflective microspheres and 20 different perspective views of the user/scene using a 5 × 4 array of convex mirrors. Real-time operation is demonstrated by displaying stereo-3D content with high brightness (up to 240 cd/m²) and low crosstalk (<4%), while 3D image capture is demonstrated through computational reconstruction of the discrete free-viewpoint stereo pair shown on a desktop or virtual reality display. Furthermore, the capture quality is quantified by measuring the imaging MTF of the captured views, and the capture light efficiency is calculated by accounting for the loss in transmitted light at each interface. Further developments in microfabrication and computational optics can make the proposed system a unique mobile platform for immersive human-computer interaction.
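The reported brightness figures can be related by a simple back-of-envelope model: a gain-1 (Lambertian) screen spreading a flux of Φ lumens over area A has luminance Φ/(πA) cd/m², and a retro-reflective screen scales this by its optical gain. The sketch below uses the projector flux and screen size quoted in the abstract; the gain model and the resulting numbers are our illustration, not values reported in the paper.

```python
import math

def screen_luminance(phi_lm, area_m2, gain=1.0):
    # Lambertian baseline luminance Phi/(pi*A), scaled by the screen's
    # optical gain (retro-reflective screens concentrate light toward
    # the illumination direction, so gain >> 1 near the projector).
    return gain * phi_lm / (math.pi * area_m2)

phi = 15.0           # one pico-projector, lumens (from the abstract)
area = 0.30 * 0.24   # screen size, m^2 (from the abstract)

lambertian = screen_luminance(phi, area)   # ~66 cd/m^2 at gain 1
# Effective gain that a single projector would need to reach the
# reported 240 cd/m^2 under this simplified model:
gain_needed = 240.0 / lambertian           # ~3.6
```

This ignores fill factor, polarizer losses, and viewing geometry, so it only bounds the order of magnitude of the screen gain implied by the measurements.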

© 2018 Optical Society of America under the terms of the OSA Open Access Publishing Agreement




Supplementary Material (2)

» Visualization 1: Demonstration of simultaneous 3D display and imaging using the dual-purpose passive screen.
» Visualization 2: Demonstration of combined 3D display and imaging.



Figures (8)

Fig. 1 Conceptual illustration of the working principle of the integrated 3D display and imaging system for a 3D telepresence application. Stereo-3D views are displayed on the screen using the head-mounted projectors while, at the same time, perspective views of the user are recorded using the head-mounted camera.
Fig. 2 (a) Fabrication of the dual-purpose screen, showing the arrangement of its layers; (b) images of the developed handheld screen; (c) and (d) close-ups of a mirror in the screen when the camera is focused on the retro-reflective pattern and on the reflected image, respectively.
Fig. 3 Principle of the polarization multiplexing of projected and captured light. The ambient scene is seen only by the camera, while the projected images are perceived only by the eyes.
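The separation in Fig. 3 relies on crossed linear polarizers: co-polarized light passes to one optical path while cross-polarized light is extinguished. A minimal Jones-calculus sketch of this mechanism (our illustration of ideal polarizers, not the paper's exact layer stack) is:

```python
import numpy as np

def polarizer(theta):
    # Jones matrix of an ideal linear polarizer at angle theta (radians).
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c * c, c * s],
                     [c * s, s * s]])

horizontal = np.array([1.0, 0.0])   # projected light, H-polarized

# Co-polarized analyzer (0 deg): projected light reaches the viewer.
through = polarizer(0.0) @ horizontal
# Crossed analyzer (90 deg): the same light is blocked, so that path
# (e.g. the camera's) does not see the projected image.
blocked = polarizer(np.pi / 2) @ horizontal

transmitted = np.linalg.norm(through) ** 2    # intensity -> 1.0
extinguished = np.linalg.norm(blocked) ** 2   # intensity -> ~0.0
```

Real polarizer sheets have finite extinction ratios, which is one source of the <4% crosstalk reported in the abstract.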
Fig. 4 (a) Tabletop experimental prototype, and (b) close view of the head-mounted unit, with two pico-projectors placed in a 3D-printed housing and a camera (see Visualization 1).
Fig. 5 (a) Eye-box size and overlap between the left and right eye-boxes at a 70 cm working distance, and (b) brightness, optical gain, and crosstalk per eye as the distance between the screen and the viewer is varied.
Fig. 6 (a) and (b) Stereo-3D content displayed on the screen, viewed through a polarizer and without a polarizer, respectively; (c) perceived views captured from the left-eye, right-eye, and center positions (see Visualization 2).
Fig. 7 (a) Perspective views of the scene captured in a single shot; (b) and (c) two different reconstructed viewpoints relayed to a stereo display.
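Since the 5 × 4 mirror array maps all 20 perspective views onto one camera frame (Fig. 7(a)), the first step of the computational reconstruction is to slice the frame into per-mirror sub-images. The sketch below is a deliberately idealized version of that step: it assumes a perfect rectilinear grid and skips the distortion correction that curved mirrors require in practice.

```python
import numpy as np

def extract_views(frame, rows=4, cols=5):
    """Split one captured frame into a rows x cols grid of sub-images,
    one per convex mirror (idealized: no mirror-distortion correction)."""
    h, w = frame.shape[:2]
    vh, vw = h // rows, w // cols
    return [frame[r * vh:(r + 1) * vh, c * vw:(c + 1) * vw]
            for r in range(rows) for c in range(cols)]

# Stand-in for a captured frame (the prototype uses a 24 MP camera).
frame = np.zeros((400, 500, 3), dtype=np.uint8)
views = extract_views(frame)   # 20 sub-images, one per mirror
```

Each extracted view then feeds the free-viewpoint stereo reconstruction shown in Figs. 7(b) and 7(c).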
Fig. 8 (a) Experimental setup for MTF measurement; (b) captured edge profile on the camera sensor for different configurations; and (c) measured MTF. With a 10% cutoff limit, the imaging quality is reduced to one-third when a polarizer sheet is placed on top of the mirror surface, and a further 50% loss is observed when the retro-reflective microspheres are patterned on the polarizer.
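The slanted-edge MTF measurement behind Fig. 8 reduces to a standard pipeline: differentiate the edge-spread function (ESF) to obtain the line-spread function (LSF), then take the Fourier transform magnitude and normalize to DC. The toy sketch below runs that core on a synthetic smooth edge; it omits the edge-angle projection and super-resolution binning of the full slanted-edge method.

```python
import numpy as np

def mtf_from_edge(esf):
    """Slanted-edge MTF core: ESF -> LSF (derivative) -> |FFT|, DC-normalized."""
    lsf = np.diff(esf)
    mtf = np.abs(np.fft.rfft(lsf))
    return mtf / mtf[0]

# Synthetic edge-spread function: a smooth step standing in for the
# measured edge profiles of Fig. 8(b).
x = np.linspace(-1.0, 1.0, 256)
esf = 0.5 * (1.0 + np.tanh(x / 0.05))

mtf = mtf_from_edge(esf)   # mtf[0] == 1.0; response rolls off with frequency
```

Sharper edges (narrower LSFs) yield slower MTF roll-off, which is why the added polarizer and microsphere layers, each blurring the edge profile, reduce the measured imaging quality.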

Tables (2)

Table 1 Polarization state of projected and ambient light at different surfaces of the screen.

Table 2 Loss in transmitted light at different interfaces of the system.
