Abstract

We propose a multi-viewer tracking integral imaging system that improves the viewing angle and enlarges the viewing zone. In the tracking integral imaging system, the pickup angle of each elemental lens in the lens array is determined by the viewers' positions, so the elemental images can be generated for each viewer to provide a wider viewing angle and a larger viewing zone. Our tracking integral imaging system is implemented with an infrared camera and infrared light-emitting diodes, which track the viewers' exact positions robustly. For multiple viewers to watch integrated three-dimensional images in the tracking integral imaging system, the relationship between the viewers' positions and the elemental images must be formulated. We analyzed this relationship and the conditions for multiple viewers, and verified them by implementing a two-viewer tracking integral imaging system.
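The viewer tracking described above uses an infrared camera and OpenCV (Ref. 20). As a rough illustration only — nothing here is from the authors' actual implementation — the sketch below shows the basic threshold-and-centroid idea for locating bright IR LED markers in a plain 2D intensity grid, without any OpenCV dependency:

```python
# Illustrative sketch only: the paper tracks IR LEDs with an infrared
# camera and OpenCV (Ref. 20); this toy version shows the general idea
# of threshold-and-centroid marker detection on a 2D intensity grid.

def find_led_centroids(frame, threshold=200):
    """Return (row, col) centroids of 4-connected bright regions."""
    rows, cols = len(frame), len(frame[0])
    seen = [[False] * cols for _ in range(rows)]
    centroids = []
    for r in range(rows):
        for c in range(cols):
            if frame[r][c] >= threshold and not seen[r][c]:
                # Flood-fill one bright blob and collect its pixels.
                stack, pixels = [(r, c)], []
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    pixels.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and frame[ny][nx] >= threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                # Centroid of the blob approximates one LED position.
                cy = sum(p[0] for p in pixels) / len(pixels)
                cx = sum(p[1] for p in pixels) / len(pixels)
                centroids.append((cy, cx))
    return centroids
```

In the actual system four such LED points per viewer (labeled 0-3 in Fig. 10) would be detected per frame; a production implementation would use OpenCV's optimized routines rather than this pure-Python flood fill.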

© 2009 OSA


References


  1. T. Okoshi, Three-Dimensional Imaging Techniques (Academic Press, New York, 1976).
  2. B. Lee, J.-H. Park, and S.-W. Min, “Three-dimensional display and information processing based on integral imaging,” in Digital Holography and Three-Dimensional Display, T.-C. Poon, ed. (Springer, 2006), Chap. 12, 333–378.
  3. J.-H. Park, S.-W. Min, S. Jung, and B. Lee, “Analysis of viewing parameters for two display methods based on integral photography,” Appl. Opt. 40(29), 5217–5232 (2001).
    [CrossRef]
  4. J.-H. Park, S. Jung, H. Choi, and B. Lee, “Integral imaging with multiple image planes using a uniaxial crystal plate,” Opt. Express 11(16), 1862–1875 (2003).
    [CrossRef] [PubMed]
  5. J.-S. Jang and B. Javidi, “Large depth-of-focus time-multiplexed three-dimensional integral imaging by use of lenslets with nonuniform focal lengths and aperture sizes,” Opt. Lett. 28(20), 1924–1926 (2003).
    [CrossRef] [PubMed]
  6. D.-H. Shin and E.-S. Kim, “Computational integral imaging reconstruction of 3D object using a depth conversion technique,” J. Opt. Soc. Korea 12(3), 131–135 (2008).
    [CrossRef]
  7. M.-O. Jeong, N. Kim, and J.-H. Park, “Elemental image synthesis for integral imaging using phase-shifting digital holography,” J. Opt. Soc. Korea 12(4), 275–280 (2008).
    [CrossRef]
  8. A. Stern and B. Javidi, “Three-dimensional image sensing, visualization, and processing using integral imaging,” Proc. IEEE 94(3), 591–607 (2006).
    [CrossRef]
  9. S. Jung, J. Hong, J.-H. Park, Y. Kim, and B. Lee, “Depth-enhanced integral-imaging 3D display using different optical path lengths by polarization devices or mirror barrier array,” J. Soc. Inf. Disp. 12(4), 461–467 (2004).
    [CrossRef]
  10. H. Liao, M. Iwahara, Y. Katayama, N. Hata, and T. Dohi, “Three-dimensional display with a long viewing distance by use of integral photography,” Opt. Lett. 30(6), 613–615 (2005).
    [CrossRef] [PubMed]
  11. R. Martínez-Cuenca, H. Navarro, G. Saavedra, B. Javidi, and M. Martínez-Corral, “Enhanced viewing-angle integral imaging by multiple-axis telecentric relay system,” Opt. Express 15(24), 16255–16260 (2007).
    [CrossRef] [PubMed]
  12. Y. Kim, H. Choi, J. Kim, S.-W. Cho, Y. Kim, G. Park, and B. Lee, “Depth-enhanced integral imaging display system with electrically variable image planes using polymer-dispersed liquid-crystal layers,” Appl. Opt. 46(18), 3766–3773 (2007).
    [CrossRef] [PubMed]
  13. J.-H. Park, J. Kim, Y. Kim, and B. Lee, “Resolution-enhanced three-dimension/two-dimension convertible display based on integral imaging,” Opt. Express 13(6), 1875–1884 (2005).
    [CrossRef] [PubMed]
  14. C. Cruz-Neira, D. J. Sandin, and T. A. DeFanti, “Surround-screen projection-based virtual reality: the design and implementation of the CAVE,” Proc. SIGGRAPH, 135–142 (1993).
  15. M. Agrawala, A. C. Beers, B. Fröhlich, P. Hanrahan, I. McDowall, and M. Bolas, “The two-user responsive workbench: support for collaboration through individual views of a shared space,” Proc. SIGGRAPH, 327–332 (1997).
  16. Y. Kitamura, T. Nakayama, T. Nakashima, and S. Yamamoto, “The IllusionHole with polarization filters,” Proc. of the ACM Symposium on Virtual Reality Software and Technology, 244–251 (2006).
  17. R. Haussler, S. Reichelt, N. Leister, E. Zschau, R. Missbach, and A. Schwerdtner, “Large real-time holographic displays: from prototypes to a consumer product,” Proc. SPIE 7237, 72370S (2009).
    [CrossRef]
  18. A. Schwerdtner, N. Leister, R. Häussler, and S. Reichelt, “Eye-tracking solutions for real-time holographic 3-D display,” Soc. Inf. Display Digest (SID’08), 345–347 (2008).
  19. G. Park, J. Hong, Y. Kim, and B. Lee, “Enhancement of viewing angle and viewing distance in integral imaging by head tracking,” in Digital Holography and Three-Dimensional Imaging, OSA Technical Digest (Optical Society of America, 2009), DWB27.
  20. “OpenCV,” http://opencv.willowgarage.com/wiki .
  21. M. Martínez-Corral, B. Javidi, R. Martínez-Cuenca, and G. Saavedra, “Multifacet structure of observed reconstructed integral images,” J. Opt. Soc. Am. A 22, 597–603 (2005).
    [CrossRef]
  22. R. Martínez-Cuenca, G. Saavedra, A. Pons, B. Javidi, and M. Martínez-Corral, “Facet braiding: a fundamental problem in integral imaging,” Opt. Lett. 32(9), 1078–1080 (2007).
    [CrossRef] [PubMed]


Supplementary Material (2)

» Media 1: MOV (878 KB)     
» Media 2: MOV (496 KB)     



Figures (14)

Fig. 1

Viewing zones in (a) conventional integral imaging system and (b) tracking integral imaging system.

Fig. 2

Viewing angle of each lens in a lens array to pick up objects in space.

Fig. 3

Overlap of elemental images of three viewers in tracking integral imaging system.

Fig. 4

Notations for the positions of elemental images in (a) real mode and (b) virtual mode in tracking integral imaging system.

Fig. 5

Elemental images on elemental image plane for two viewers in tracking integral imaging system.

Fig. 6

Range of positions of the n-th elemental image for the secondary viewer that avoids overlap with the primary viewer's elemental image: (a), (b) real mode; (c), (d) virtual mode.

Fig. 7

Viewing zone without overlap with the secondary viewer, with respect to the primary viewer's position: (a), (b) magnification ratios of 3 and 5; (c), (d) lens focal lengths of 10 mm and 20 mm; (e), (f) primary viewer's distance from the lens array along the z-axis of 1.5 m and 2.5 m.

Fig. 8

Experimental setup with a small pixel pitch LCD monitor with a lens array and an infrared camera.

Fig. 9

3D characters (Comic Sans MS font) used in experiments and elemental images. (a) 3D characters for a primary viewer, (b) 3D characters for a secondary viewer, (c), (d) elemental images for the 3D characters in conventional integral imaging system, (e), (f) elemental images made in tracking integral imaging system.

Fig. 10

Tracking result when the two viewers are in positions with no overlap. Points 0, 1, 2, and 3 denote the detected IR LEDs. The blue rectangle is the inner boundary and the orange rectangle is the outer boundary for no overlap.

Fig. 11

Elemental images for two viewers in the positions for no overlap.

Fig. 12

Integrated images with no overlap: (a) images captured from 7 positions and (b) corresponding movie (Media 1).

Fig. 13

Tracking result when two viewers are in the positions for overlap.

Fig. 14

Experimental results with the overlap condition: (a) elemental images for two viewers in the positions for overlap, (b) an integrated image and (c) corresponding movie (Media 2).

Tables (1)


Table 1 Experimental specification

Equations (21)


$$A_n = -\frac{g V_y}{V_z} + \left(n-\frac{1}{2}\right) P_L \left(1+\frac{g}{V_z}\right),$$
$$\overline{A_n A_{n-1}} = P_L \left(1+\frac{g}{V_z}\right).$$
$$B_n = \frac{L V_y}{V_z} + \left(n-\frac{1}{2}\right) P_L \left(1-\frac{L}{V_z}\right),$$
$$\overline{B_n B_{n-1}} = P_L \left(1-\frac{L}{V_z}\right).$$
$$C_{n1} = \frac{g}{L}\left(B_{n+1}-nP_L\right)+(n-1)P_L,$$
$$C_{n2} = \frac{g}{L}\left(B_n-nP_L\right)+(n-1)P_L,$$
$$\overline{C_{n2} C_{n1}} = \left|\frac{g}{L} P_L \left(1-\frac{L}{V_z}\right)\right|.$$
$$C_{n1} > C'_{n2},$$
$$C'_{n1} > C_{(n-1)2}.$$
$$V_{y1}+P_L\left(\frac{V_z}{L}-1\right) < V_{y2} < V_{y1}+P_L\left(\frac{V_z}{g}-\frac{V_z}{L}+2\right).$$
$$C_{(n+1)1} > C'_{n2},$$
$$C'_{n1} > C_{n2}.$$
$$V_{y1}-P_L\left(\frac{V_z}{g}-\frac{V_z}{L}+2\right) < V_{y2} < V_{y1}-P_L\left(\frac{V_z}{L}-1\right).$$
$$P_L\left(\frac{V_z}{L}-1\right) < \Delta V_y < P_L\left(\frac{V_z}{g}-\frac{V_z}{L}+2\right).$$
$$C_{n2} > C'_{n1},$$
$$C'_{n2} > C_{(n-1)1}.$$
$$V_{y1}+P_L\left(1-\frac{V_z}{L}\right) < V_{y2} < V_{y1}+P_L\left(\frac{V_z}{g}+\frac{V_z}{L}\right).$$
$$C_{(n+1)2} > C'_{n1},$$
$$C'_{n2} > C_{n1}.$$
$$V_{y1}-P_L\left(\frac{V_z}{g}+\frac{V_z}{L}\right) < V_{y2} < V_{y1}-P_L\left(1-\frac{V_z}{L}\right),$$
$$P_L\left(1-\frac{V_z}{L}\right) < \Delta V_y < P_L\left(\frac{V_z}{g}+\frac{V_z}{L}\right).$$
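The geometry behind these equations can be evaluated numerically. The sketch below (not the authors' code; the parameter values for lens pitch P_L, gap g, central depth plane L, and viewer distance V_z are assumed purely for illustration) computes the elemental-image positions A_n and B_n and the no-overlap range of the viewer separation ΔV_y for the real-mode condition:

```python
# Sketch (assumed parameter values, not from the paper): evaluates the
# elemental-image positions A_n and B_n and the no-overlap range of the
# viewer separation dVy = Vy2 - Vy1 from the equations above.

def A(n, Vy, Vz, g, PL):
    """Position of the n-th elemental image on the image plane (gap g)."""
    return -g * Vy / Vz + (n - 0.5) * PL * (1 + g / Vz)

def B(n, Vy, Vz, L, PL):
    """Position of the n-th image region on the central depth plane L."""
    return L * Vy / Vz + (n - 0.5) * PL * (1 - L / Vz)

def no_overlap_range(PL, g, L, Vz):
    """Bounds on dVy for non-overlapping elemental images (real mode)."""
    lower = PL * (Vz / L - 1)
    upper = PL * (Vz / g - Vz / L + 2)
    return lower, upper

if __name__ == "__main__":
    PL, g, L, Vz = 1.0, 3.3, 100.0, 2000.0  # mm; illustrative values
    # Spacing between adjacent elemental images should equal
    # P_L * (1 + g / Vz), as in the second equation above.
    spacing = A(2, 0.0, Vz, g, PL) - A(1, 0.0, Vz, g, PL)
    print("elemental image spacing:", spacing)
    lo, hi = no_overlap_range(PL, g, L, Vz)
    print("no-overlap dVy range (mm):", lo, hi)
```

Note that the spacing of A_n (and B_n) is independent of the viewer's lateral position V_y, which is why only the separation ΔV_y between the two viewers enters the overlap conditions.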
